Patent abstract:
METHOD FOR DETERMINING A POLARIZATION OF A SCENE, AND SYSTEM FOR ACQUIRING IMAGES FROM A SCENE Systems are described that are capable of acquiring polarimetry data using a single camera, with or without a polarization filter. When a polarization filter is used, the data acquisition method comprises: (1) maneuvering the aircraft (or other vehicle) to orient the polarization filter (and the camera) in several directions as images are captured, (2) registering the various images with respect to each other, and (3) computing polarimetry values (such as Stokes parameters) for points of interest in the images. When no polarization filter is used, the data acquisition method comprises maneuvering the aircraft (or other vehicle) to orient the camera in several directions as images are captured and then performing the same operations (2) and (3). These methods measure the amount of polarization in a given scene by taking multiple images with the camera at different angles.
Publication number: BR102015001708B1
Application number: R102015001708-1
Filing date: 2015-01-26
Publication date: 2020-11-24
Inventors: Brian J. Tillotson; Jennifer K. Baerny
Applicant: The Boeing Company
IPC main classification:
Patent description:

BACKGROUND OF THE INVENTION
[001] This description relates generally to systems and methods for measuring the polarization of light in images. In particular, it concerns the use of polarization and polarimetry to visually detect objects of interest. As used herein, the term "polarimetry" means the measurement and interpretation of the polarization of transverse waves, such as electromagnetic waves.
[002] The general problem addressed here is to improve systems and methods for measuring the polarization of light in images, specifically images captured from moving vehicles. Polarization and polarimetry can help users detect many objects of interest. For example, in a natural scene filled with unpolarized light, smooth surfaces appear as sources of linearly polarized light; these surfaces often correspond to artifacts, such as weapons or downed aircraft (which are often sought by the military) or foreign objects capable of causing damage (which most vehicle operators try to avoid). Polarization allows surveillance analysts to see through haze or the glare of surfaces such as water or windows. Polarization enables military analysts to find submarines and mines or snipers hidden behind windows, and enables fishermen to find schools of fish. Polarization can also help civilian users measure climate parameters or assess the health of forests and plantations.
[003] Despite these benefits, polarimetry and polarized imaging systems are rarely used. The reasons are cost, weight and reliability. In general, it is not enough to place a single polarization filter in front of a single camera. To measure the polarization in an image and to discern which parts of the image have polarization different from other parts, one must capture images with at least two, and usually three, orientations of a polarization filter. In the prior art, this meant: (1) an electrically controlled rotating filter mounted on a camera lens, (2) an electrically controlled filter wheel with several polarization filters mounted at different angles or (3) multiple cameras, each with a differently oriented polarization filter. The cost, weight and reliability penalties of these approaches have precluded most uses of polarimetry for images taken outside a laboratory.
[004] In the case of an electrically controlled filter wheel, the wheel is configured to position polarization filters with three or four different orientations in front of a single camera. A filter wheel is a fairly rugged optical component with moving parts. It is approximately as heavy as a small camera used in a typical unmanned aerial vehicle (UAV). It occupies substantial volume. Because it contains an electromechanical actuator, it is substantially less reliable than a digital camera and therefore reduces the reliability of an aircraft mission system.
[005] A rotating polarizer in front of a single camera is smaller than a filter wheel, but is still a robust optical component with moving parts. It substantially increases the weight of a small camera and can substantially increase its volume. It contains an electromechanical actuator, which reduces the reliability of an aircraft's mission system.
[006] In the third case, a system comprising multiple cameras facing the same direction, each with a differently oriented polarizer in front of it, imposes a small cost, weight and reliability penalty for each camera. However, using three or four cameras instead of one increases the cost and weight and decreases the reliability of the system.
[007] According to a further development, differently oriented polarization filters are placed in front of individual pixels in a charge-coupled device (CCD). Such a camera produces a digital image structured like a three- or four-color picture, but each "color" corresponds to the intensity of a different polarization. It is not clear that a pixel-by-pixel polarization filter can be made economically. Such a camera does not allow real color image processing (for example, red, blue and green) concurrent with polarimetry. Such a CCD chip is designed to transmit four "colors" (one for each polarization) instead of the usual three expected by image file formats. This imposes technical and economic barriers to widespread acceptance.
[008] It would be desirable to provide improved devices and methods for collecting visual polarimetry data from a moving vehicle (for example, an aerial vehicle) with an optimal combination of low cost, low weight and high reliability.

SUMMARY OF THE INVENTION
[009] The subject matter in question comprises systems capable of acquiring polarimetry data using a single camera, with or without a polarization filter. When a polarization filter is used, the data acquisition method comprises: (1) maneuvering the aircraft (or other vehicle) to orient the polarization filter (and the camera) in several directions as images are captured, (2) registering the various images with respect to each other and (3) computing polarimetry values (such as Stokes parameters) for points of interest in the images. When no polarization filter is used, the data acquisition method comprises maneuvering the aircraft (or other vehicle) to orient the camera in several directions as images are captured and then performing the same operations (2) and (3) on a computer system. These methods measure the amount of polarization in a given scene by taking multiple images with the camera at different angles.
[0010] One aspect of the subject matter described here is a method for determining the polarization of a scene, which comprises: (a) placing a linear polarization filter in the field of view of a camera comprising a lens and an array of sensors; (b) successively locating the camera and the linear polarization filter in proximity to a single position, but in three different orientations, for each of which a scene is in the camera's field of view; (c) capturing first through third filtered images while the camera and the linear polarization filter are in the three different orientations, respectively; (d) transferring first through third sets of image data representing, respectively, the first through third filtered images from the camera to a computer system comprising hardware and software; and (e) computing a polarization of at least one point in the scene from the first through third sets of image data. The method may further comprise mounting the camera and the linear polarization filter on a vehicle, wherein step (b) comprises maneuvering the vehicle, and/or registering the first through third sets of image data with respect to each other before performing step (e). In the disclosed embodiment, step (e) comprises computing Stokes parameters. In one implementation, the respective angles about the camera's line of sight, measured with respect to a reference, for the first two of the three different orientations differ by an odd integer multiple of 45°, and the respective angles about the camera's line of sight, measured with respect to the same reference, for the second two of the three different orientations differ by 90°.
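As an illustration of the orientation constraint just described, a short sketch can check a candidate triple of roll angles. The helper function and its name are hypothetical, not part of the patent:

```python
def orientations_ok(a1_deg, a2_deg, a3_deg, tol=1e-9):
    """Check the constraint described above: the first two roll angles about
    the line of sight differ by an odd integer multiple of 45 degrees, and
    the second two differ by 90 degrees."""
    ratio = abs(a2_deg - a1_deg) / 45.0
    odd_multiple_of_45 = abs(ratio - round(ratio)) < tol and round(ratio) % 2 == 1
    differs_by_90 = abs(abs(a3_deg - a2_deg) - 90.0) < tol
    return odd_multiple_of_45 and differs_by_90

print(orientations_ok(0, 45, 135))   # True: 45° apart (odd multiple), then 90°
print(orientations_ok(0, 90, 180))   # False: 90° is an even multiple of 45°
```

The triple (0°, 45°, 135°) used elsewhere in this description satisfies the constraint.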
[0011] Another aspect of the subject matter described here is a system for acquiring images of a scene, which comprises: an unmanned vehicle; a camera on board the unmanned vehicle, the camera comprising a lens and a sensor array; a first linear polarization filter disposed in front of at least a first part of the sensor array; an unmanned vehicle control system capable of controlling the unmanned vehicle to perform maneuvers, the unmanned vehicle control system comprising hardware and software, the unmanned vehicle control system software being configured to control the unmanned vehicle to position itself at, or close to, a specified position on each of first, second and third occasions and in first, second and third orientations that are different from each other but that each place the scene in the camera's field of view; and a camera control system arranged on board the unmanned vehicle and capable of controlling the camera to capture images, the camera control system comprising hardware and software, the camera control system software being configured to control the camera to capture first, second and third images of a target scene on the first, second and third occasions, respectively, and then transmit first, second and third sets of image data representing, respectively, the first, second and third images. The system may further comprise an image data processing system capable of processing image data, the image data processing system comprising hardware and software, the image data processing system software being configured to register the first, second and third sets of image data with respect to each other and to compute polarization values for the imaged scene.
[0012] An additional aspect is a method for determining the polarization of a scene, which comprises: (a) characterizing the polarizing power of a camera comprising a lens and an array of sensors; (b) successively locating the camera in proximity to a single position, but in three different orientations, for each of which a scene is in the camera's field of view; (c) capturing first through third images while the camera is in the three different orientations, respectively; (d) transferring first, second and third image data sets representing the first through third captured camera images to a computer system; and (e) computing a polarization of at least one point in the scene from the first, second and third sets of image data. In the disclosed embodiments, step (a) comprises determining first and second elements of a Mueller matrix. In one implementation, step (a) comprises determining at least one of a first element of the Mueller matrix and a second element of the Mueller matrix for at least two positions in the sensor array, these positions corresponding to different angles of incidence for light that passes through the center of the lens.
[0013] Another aspect is a system for acquiring images of a scene, which comprises: an unmanned vehicle; a camera on board the unmanned vehicle, the camera comprising a lens and a sensor array; an unmanned vehicle control system capable of controlling the unmanned vehicle to perform maneuvers, the unmanned vehicle control system comprising hardware and software, the unmanned vehicle control system software being configured to control the unmanned vehicle to position itself at, or close to, a specified position on each of first, second and third occasions and in first, second and third orientations that are different from each other but that each place the scene in the camera's field of view; and a camera control system arranged on board the unmanned vehicle and capable of controlling the camera to capture images, the camera control system comprising hardware and software, the camera control system software being configured to control the camera to capture first, second and third images of a target scene on the first, second and third occasions, respectively, and then transmit first, second and third sets of image data representing, respectively, the first, second and third images. The system may further comprise an image data processing system capable of processing image data, the image data processing system comprising hardware and software, the image data processing system software being configured to register the first, second and third sets of image data with respect to each other and to compute polarization values for the imaged scene based in part on stored data representing a characterization of the polarizing power of the camera.
[0014] Yet another aspect is a method for measuring polarization in the light coming from a scene, which comprises: (a) capturing successive images of a scene using a camera positioned close to a single position and oriented at successively different orientation angles, wherein a set of matrices characterizing the polarizing power of the camera at different angles of incidence and different orientation angles is known, and there is no polarization filter between the camera's array of sensors and the scene; (b) registering the captured images with respect to each other; and (c) computing polarimetry values for light from at least one point of interest in the scene based on the registered captured images and the known matrices, wherein steps (b) and (c) are performed using a computer system that comprises hardware and software. According to one embodiment, each matrix is a Mueller matrix, the computed polarimetry values are Stokes parameters, the polarimetry values include intensity and polarization angle, and the scene is imaged at three different orientation angles about an optical axis of the camera, these orientation angles being spaced at 45-degree intervals.
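The filterless variant above relies on the camera itself acting as a weak polarizer. The following is a hedged sketch of the underlying algebra only: the Mueller-row values and function names are invented for illustration (in practice the values come from the camera characterization described later). Three intensity measurements at three camera roll angles form a 3×3 linear system that can be inverted for the first three Stokes parameters:

```python
import math

# Hypothetical first-row Mueller elements (m00, m01, m02) for one pixel's
# angle of incidence; invented for illustration, not taken from the patent.
M_ROW = (1.0, 0.08, 0.0)

def predicted_intensity(stokes, roll_deg):
    """Intensity an unfiltered, weakly polarizing camera records when the
    vehicle rolls the camera by roll_deg about the line of sight."""
    i, q, u = stokes
    a = math.radians(2.0 * roll_deg)
    q_cam = q * math.cos(a) + u * math.sin(a)    # scene Stokes in camera frame
    u_cam = -q * math.sin(a) + u * math.cos(a)
    m00, m01, m02 = M_ROW
    return m00 * i + m01 * q_cam + m02 * u_cam

def solve_stokes(rolls_deg, intensities):
    """Invert the three measurement equations for (I, Q, U) using Cramer's
    rule (kept dependency-free; NumPy's least-squares would also do)."""
    m00, m01, m02 = M_ROW
    rows = []
    for r in rolls_deg:
        a = math.radians(2.0 * r)
        # coefficients of I, Q and U in the predicted intensity
        rows.append((m00,
                     m01 * math.cos(a) - m02 * math.sin(a),
                     m01 * math.sin(a) + m02 * math.cos(a)))

    def det3(m):
        (a, b, c), (d, e, f), (g, h, i) = m
        return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

    d = det3(rows)
    solution = []
    for col in range(3):
        m = [list(row) for row in rows]
        for k in range(3):
            m[k][col] = intensities[k]
        solution.append(det3(m) / d)
    return tuple(solution)
```

Note that the system is invertible only because m01 is nonzero: a perfectly non-polarizing camera (m01 = m02 = 0) would make all three equations identical, which is why the method depends on characterizing the camera's residual polarizing power.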
[0015] An additional aspect of the subject matter described here is an empirical method for characterizing the polarizing power of a camera, the camera having a lens and a focal plane array of sensors, at a specified angle of incidence of impinging light and a specified orientation angle, the method comprising: (a) providing a target that emits unpolarized light; (b) aiming the camera at the target without an intervening polarization filter and with a part of the target projected onto sensors at the center of the focal plane array; (c) capturing a reference image while the camera is in the state described in step (b); (d) calculating a set of reference pixel values for a set of pixels in the reference image that are adjacent to a pixel produced by a sensor at the center of the focal plane array; (e) aiming the camera at the target without an intervening polarization filter and with a part of the target projected onto sensors near an edge or a corner of the focal plane array; (f) capturing a first image while the camera is in the state described in step (e); (g) calculating a first set of pixel values for a set of pixels in the first image that are adjacent to a pixel produced by a sensor near the edge or corner of the focal plane array; (h) placing a linear polarization filter between the camera and the target; (i) capturing a second image while the camera is in the state described in steps (e) and (h); (j) calculating a second set of pixel values for a set of pixels in the second image that are adjacent to the pixel produced by the sensor near the edge or corner of the focal plane array; (k) calculating a first element of a matrix based on the set of reference pixel values and the first set of pixel values; and (l) calculating a second element of the matrix based on at least the set of reference pixel values and the second set of pixel values.
The foregoing method may additionally comprise: (m) rotating the linear polarization filter by 90°; (n) capturing a third image while the camera is in the state described in steps (e) and (m); and (o) calculating a third set of pixel values for a set of pixels in the third image that are adjacent to the pixel produced by the sensor near the edge or corner of the focal plane array, wherein, in step (l), the second element of the matrix is calculated based on at least the set of reference pixel values and the second and third sets of pixel values. Furthermore, the empirical method may comprise computing an intensity coefficient based on the set of reference pixel values and the second and third sets of pixel values. According to one embodiment, step (h) additionally comprises orienting the linear polarization filter with its polarization axis parallel to either the plane of the surface at the center of the camera lens or the plane of incidence at the center of the camera lens.
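A minimal sketch of the pixel-averaging arithmetic in the calibration steps above might look as follows. The function names, the center-normalization and the assumption of an ideal external polarizer are all illustrative choices, not taken from the text:

```python
def mean(pixels):
    """Average of a set of adjacent pixel values, as in steps (d), (g), (j), (o)."""
    return sum(pixels) / len(pixels)

def mueller_elements(ref_px, edge_px, pol0_px, pol90_px):
    """Hedged sketch of estimating the first two first-row Mueller elements
    for an off-axis pixel neighborhood.  ref_px: pixels at the center of the
    focal plane array viewing the unpolarized target; edge_px: the same
    target imaged near the edge with no filter; pol0_px / pol90_px: edge
    pixels with an external linear polarizer at 0 degrees and rotated 90."""
    m00 = mean(edge_px) / mean(ref_px)       # transmission relative to center
    p0, p90 = mean(pol0_px), mean(pol90_px)
    # Unpolarized light behind an ideal polarizer at 0/90 degrees yields a
    # recorded intensity proportional to (m00 +/- m01), so the normalized
    # difference of the two filtered measurements isolates m01.
    m01 = m00 * (p0 - p90) / (p0 + p90)
    return m00, m01
```

For example, synthetic pixel means of 100 (center), 95 (edge, no filter), 50 and 45 (edge, polarizer at 0° and 90°) yield m00 = 0.95 and m01 = 0.05, i.e., a camera that slightly attenuates and slightly polarizes off-axis light.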
[0016] Compared to prior technology solutions, the systems described here can provide one or more of the following benefits. (1) The described systems may have lower weight, lower cost and (because no moving parts are added to the aircraft) higher reliability, because they have neither a filter wheel nor a rotating polarizer. (2) The described systems may weigh and cost less because they employ fewer cameras and therefore have fewer electronic components and electrical connections, resulting in higher reliability. (3) Contrary to recent developments involving polarization filters on a CCD, the described systems do not require the development of new electronic manufacturing processes, so the schedule and cost to implement them for different applications are better. Real color image processing (for example, red, blue and green) is popular with users and is necessary for some applications. The described systems allow real color image processing concurrent with polarimetry; polarization filters on a CCD do not allow this. The filter used in the systems described here is easy to remove in most embodiments and therefore allows efficient unpolarized imaging with the same camera at the same resolution. Polarization filters attached to a CCD are difficult or impossible to remove, so unpolarized imaging is available only with a second (expensive) camera or by adding the intensities of neighboring pixels with different polarizations (lower photon efficiency and lower resolution).
[0017] Other aspects of improved systems and methods for measuring light polarization in images are described below.

BRIEF DESCRIPTION OF THE DRAWINGS
[0018] Figures 1A, 1B and 1C are diagrams showing, respectively, a grid of camera pixels overlaying a target scene in respective orientations as the aircraft maneuvers to orient the camera. The dark spot in one corner of the camera's pixel grid marks the same pixel in all images. The double-headed arrows indicate the respective polarization angles corresponding to the respective orientations of the linear polarization filter.
[0019] Figures 2A, 2B and 2C show images that are captured at the instants respectively shown in figures 1A, 1B and 1C.
[0020] Figure 3 is a graph illustrating how perspective distorts the orientation of the polarizer for pixels that are not at the center of an image.
[0021] Figures 4A and 4B are diagrams representing top and side views of a fixed-wing aircraft equipped with a downward-facing camera.
[0022] Figure 5 is a diagram showing a flight path for the fixed-wing aircraft shown in figures 4A and 4B, which flight path involves changes in heading to orient a polarization filter mounted on the camera during three successive passes over a target.
[0023] Figures 6A and 6B are diagrams representing side and front views of a fixed-wing aircraft equipped with a forward-facing camera with a polarization filter.
[0024] Figure 7 shows a flight path for the fixed-wing aircraft 20 shown in figures 6A and 6B, which flight path involves changes in bank angle to orient the forward-facing camera with polarization filter during flight along a straight path (i.e., a line of sight) directed toward a target.
[0025] Figure 8 is a diagram showing a flight path for the fixed-wing aircraft shown in figures 6A and 6B, this flight path involving changes in heading and changes in bank angle to orient the forward-facing camera with polarization filter for three successive passes through the same position along an initial line of sight to a target.
[0026] Figure 9 is a diagram showing a camera configuration in which a polarization filter overlaps part of a focal plane array of pixels inside the camera.
[0027] Figure 10 is a diagram representing a front view of the polarization filter that overlaps part of a focal plane array. This is the view that would be visible if the camera were sectioned along a plane indicated by line 10 - 10 in figure 9.
[0028] Figure 11 is a diagram representing a front view of two linear polarization filters that overlap respective halves of a focal plane array according to one embodiment. The two linear polarization filters are oriented perpendicular to each other.
[0029] Figure 12 is a diagram representing a front view of four polarization filters that overlap respective quadrants of a focal plane array. Three of the four polarization filters are linear with different orientations, while the fourth polarization filter is circular.
[0030] Figure 13 is a diagram representing a top view of a typical camera mounted on a two-axis gimbal.
[0031] Figure 13A is a diagram representing a sectional view of the gimbal-mounted camera shown in figure 13. The gimbal is sectioned along a plane indicated by line 13A - 13A in figure 13.
[0032] Figure 14 is a diagram representing a top view of a camera mounted on a gimbal designed to facilitate changing the orientation of a polarization filter attached to the camera.
[0033] Figure 14A is a diagram representing a sectional view of the gimbal-mounted camera shown in figure 14. The gimbal is sectioned along a plane indicated by line 14A - 14A in figure 14.
[0034] Figure 14B is a diagram representing a sectional view of the gimbal-mounted camera after it has been rotated by 90° about the gimbal's elevation axis.
[0035] Figure 15 is a diagram representing a side view of an unmanned aerial vehicle with a gimbaled camera in a ball turret.
[0036] Figure 16 is a diagram representing a side view of an unmanned aerial vehicle with a gimbaled camera and a polarization filter applied in a ball turret to polarize part of the camera's field of regard.
[0037] Figures 17A and 17B are diagrams showing, respectively, no polarization when light strikes glass at normal incidence (figure 17A) and stronger reflection of s-polarized light at oblique incidence, which enriches the transmitted beam in p-polarized light (figure 17B).
[0038] Figures 18A through 18C are diagrams showing, respectively, different polarization by a lens at different angles θ = 0 (figure 18A), ~20° (figure 18B) and ~40° (figure 18C) relative to the optical axis, corresponding to respective different pixel positions.
[0039] Figure 19 is a diagram showing different polarization of light that passes through a lens at different angles, corresponding to different positions of an object's image in a focal plane array. The eccentricity of each ellipse shows the degree of polarization; the ellipse's orientation shows the direction of the polarization.
[0040] Figures 19A through 19C are diagrams showing, respectively, different intensities of an object in the scene at different pixel positions that reveal its polarization (line width indicates intensity) for vertically polarized light (figure 19A), horizontally polarized light (figure 19B) and unpolarized light (figure 19C).
[0041] Figure 20A is a diagram showing a sectional view of an untilted lens and a lens tilted at angle θ. These lenses are sectioned along a plane indicated by line 20A - 20A in figure 20B.
[0042] Figure 20B is a diagram showing frontal views of the lenses represented in figure 20A and other lenses tilted in different orientations ψ.
[0043] Figure 20C is a diagram representing images of an object projected onto a focal plane array mounted coaxially with the lenses represented in figure 20B. The angle ψ corresponds to the angular position around the center of the focal plane.
[0044] Figure 21A is a diagram showing that light parallel to the lens axis is unpolarized at the center of the lens and only weakly polarized at the edge of the lens.
[0045] Figure 21B is a diagram showing that light arriving at a large angle incurs stronger polarization at all points on the lens (the degree of polarization varies slightly across the lens surface; only the central ray is shown).
[0046] Figure 22 is a diagram showing the basic arrangement of a typical internal focus lens system.
[0047] Figure 23 is a diagram showing a sectional view of a focal plane array in a typical CCD.
[0048] Figure 24 is a diagram showing an experimental configuration to characterize a camera with a CCD.
[0049] Figure 25 is a diagram showing a close-up view of part of a target in a reference image acquired at θ = 0°.
[0050] Figure 26 is a diagram showing three loops of a flight path for a fixed-wing aircraft of the type shown in figures 4A and 4B, except that the polarization filter has been removed from the fixed downward-facing camera. The aircraft performs three banking maneuvers that tilt the camera at a 45° bank angle toward the west, the northwest and the north during flight directly over a target.
[0051] Figure 27 is a block diagram identifying the main components of a polarimetric data acquisition system according to one embodiment.
[0052] Each figure in this description shows a variation of an aspect of the presented embodiments, and only differences will be discussed in detail.
[0053] Reference will now be made to the drawings, in which similar elements in different drawings bear the same reference numbers.

DETAILED DESCRIPTION
[0054] Various embodiments will be described in order to illustrate various applications of the principles set forth herein. Although the embodiments shown in the drawings and described in detail below involve mounting a camera on an aircraft (for example, a fixed-wing aircraft, such as a UAV, or a rotary-wing aircraft, such as a helicopter), it should be appreciated at the outset that the principles disclosed herein can also be applied to spacecraft and unmanned underwater vehicles (UUVs).
[0055] According to some embodiments, the system for acquiring polarization values for an imaged target comprises: an aircraft; an on-board navigation and control system capable of flying the aircraft to a three-dimensional position (for example, longitude, latitude and altitude) and subsequently returning the aircraft to approximately the same position at least twice, and also capable of measuring the orientation of the aircraft at that position and setting the aircraft in a different selected orientation each time it returns to the same position; an on-board camera with a known orientation relative to the aircraft; an on-board linear polarization filter with a known fixed orientation relative to the camera; an on-board control system capable of controlling the camera to capture images when the aircraft arrives at the selected position in one of the selected orientations; a computer comprising hardware and software (on board or on the ground) programmed to register the images and compute polarization values for the imaged target; and a device for transferring images from the camera to the computer.
[0056] For those embodiments with a camera and a polarization filter mounted on a fixed-wing aircraft that cannot hover in one position while taking a series of images, the main steps of the process include: (a) flying the aircraft toward a position from which a target is in the camera's field of view; (b) before reaching the position, orienting the aircraft in a first aircraft orientation corresponding to a first orientation of the filter about the line of sight to the target; (c) capturing a first image of the target while the aircraft is at the position and in the first aircraft orientation; (d) flying the aircraft toward the same position again; (e) before reaching the position, or a point close to it, a second time, orienting the aircraft in a second aircraft orientation corresponding to a second orientation of the filter about the line of sight to the target; (f) capturing a second image of the target while the aircraft is at the position, or close to it, and in the second aircraft orientation; (g) flying the aircraft toward the same position again; (h) before reaching the position, or a point close to it, a third time, orienting the aircraft in a third aircraft orientation corresponding to a third orientation of the filter about the line of sight to the target; (i) capturing a third image of the target while the aircraft is at the position, or close to it, and in the third aircraft orientation; (k) transferring the image data and the data that define the three orientations to a computer; (l) performing calculations to geometrically register the images with respect to each other; and (m) calculating polarization parameters, such as Stokes parameters, for the target image.
Although it is preferred that the camera be in precisely the same position during each pass of the air vehicle at the different camera orientations, those skilled in the art of air vehicles will recognize that such accuracy depends on the accuracy of the positioning system used, wind conditions and other factors.
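The flight procedure above can be sketched as a simple acquisition loop. The vehicle and camera interfaces below (fly_to, orient_filter, capture) are hypothetical placeholders standing in for the aircraft's navigation, control and camera systems, not APIs from the patent:

```python
# Filter orientations about the line of sight, one per pass over the target.
FILTER_ROLLS_DEG = (0.0, 45.0, 90.0)

def acquire_polarimetry_passes(vehicle, camera, position, rolls=FILTER_ROLLS_DEG):
    """One pass over `position` per filter orientation; returns (roll, image)
    pairs for later registration and Stokes computation (steps (k)-(m))."""
    captures = []
    for roll in rolls:
        vehicle.fly_to(position)                   # steps (a), (d), (g)
        vehicle.orient_filter(roll)                # steps (b), (e), (h)
        captures.append((roll, camera.capture()))  # steps (c), (f), (i)
    return captures
```

The returned pairs record which filter orientation produced each image, which the registration and Stokes computations downstream require.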
[0057] Before describing any system in detail, it may be useful to consider why the polarizer typically must take three different orientations relative to the target. Consider partially polarized light arriving from a target. Assume for the moment that the circular polarization is zero, so that only linear polarization is of interest. The system user wants to know how much of the light from the target is polarized, how much is unpolarized, and what the orientation of the polarized light is.
[0058] To answer these questions, one can first measure the light intensity at one polarization angle. Suppose that angle is vertical and call it the zero angle, and suppose an intensity of one unit is measured. Next, one can measure the intensity at a 90° polarization angle, that is, horizontal polarization. Suppose this intensity is also one unit. With these two measurements, it is not possible to determine whether the light is (1) completely unpolarized with an intensity of two units, (2) polarized at 45° with an intensity of two units or (3) polarized at 135° with an intensity of two units. This is a general problem: two measurements are never enough, regardless of which two angles are chosen. To resolve the ambiguity, a third measurement is preferably taken at a polarization angle of 45° or 135°. Suppose 45° is used. If an intensity of zero is measured, the light is 100% polarized at 135°. If an intensity of two units is measured, the light is 100% polarized at 45°. If an intensity of one unit is measured, the light is 100% unpolarized. Intermediate values between zero and two units indicate the fractional polarization and the angle of the polarized part.
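The ambiguity described above can be checked numerically with Malus's law for partially polarized light (a standard optics formula, not stated in this passage): the 0° and 90° measurements come out identical in all three cases, and only the 45° measurement separates them:

```python
import math

def measured_intensity(theta_deg, i_unpol, i_pol, pol_angle_deg):
    """Intensity behind an ideal linear polarizer at theta_deg: half the
    unpolarized component plus the polarized component per Malus's law."""
    delta = math.radians(theta_deg - pol_angle_deg)
    return 0.5 * i_unpol + i_pol * math.cos(delta) ** 2

# The three indistinguishable-by-two-measurements cases from the text:
# (unpolarized intensity, polarized intensity, polarization angle)
cases = {
    "unpolarized, I=2":       (2.0, 0.0, 0.0),
    "fully polarized at 45":  (0.0, 2.0, 45.0),
    "fully polarized at 135": (0.0, 2.0, 135.0),
}
for name, (iu, ip, ang) in cases.items():
    ints = [measured_intensity(t, iu, ip, ang) for t in (0, 90, 45)]
    print(name, [round(v, 6) for v in ints])
```

All three cases measure one unit at 0° and at 90°; the 45° measurement yields one, two and zero units, respectively, resolving the ambiguity exactly as described.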
[0059] There are cases where information about the target scene allows one measurement to be eliminated. For example, if there is only a single unpolarized light source illuminating a convex object made of optically isotropic material, then the only two orientations needed for measuring optical intensity are the orientation parallel to a facet of the object's surface and the orientation perpendicular to that facet. There can be no light polarized at 45° to the surface. However, such cases are rare: for most applications, it is necessary to measure the intensity at three different orientations. These orientations need not be separated by multiples of 45°, but the associated math is easier if they are.
[0060] Those skilled in the art know that polarization is not only linear, but also includes circular components. Most of the embodiments described here in detail ignore circular polarization, a simplification with little cost in utility. Circular polarization is rare. Even when it occurs, it is usually quite weak unless steps have been taken to produce circularly polarized light.
[0061] There are several equivalent ways of mathematically describing a given polarization state. One of these descriptions uses four parameters called the Stokes parameters. This description is the easiest to relate to a set of intensity measurements at various angles, so Stokes parameters are used throughout this description. The Stokes parameters are often collected together in a four-element vector called a Stokes vector.
[0062] The fourth Stokes parameter is a measure of circular polarization. Since the modalities described here largely disregard circular polarization, this description focuses on the first three Stokes parameters. The terms "Stokes parameters" and "Stokes vector" here used, typically, mean only the first three parameters or a vector of three elements of these parameters, respectively.
[0063] The four Stokes parameters are labeled I, Q, U and V. The first three are calculated from intensity measurements as follows:

I = Int0 + Int90 = Int45 + Int135 (1)

Q = Int0 - Int90 (2)

U = Int45 - Int135 (3)

where Int0, Int45, Int90 and Int135 are the intensities measured at the angles indicated by the subscripts, in degrees. In the described embodiments, the system makes only three measurements. Any one intensity value can be calculated from the other three; for example, given Int0, Int45 and Int90, one can use the right side of Eq. (1) to calculate Int135:

Int135 = Int0 + Int90 - Int45 (4)
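As a minimal sketch, Eqs. (1)-(4) can be implemented directly; the function name and the example values are illustrative:

```python
def stokes_from_intensities(int0, int45, int90):
    """Compute the first three Stokes parameters from intensities
    measured at polarizer angles 0, 45 and 90 degrees, per Eqs. (1)-(4).
    Int135 is derived from the other three rather than measured."""
    int135 = int0 + int90 - int45          # Eq. (4)
    i = int0 + int90                       # Eq. (1)
    q = int0 - int90                       # Eq. (2)
    u = int45 - int135                     # Eq. (3)
    return i, q, u

# The measurements from paragraph [0058] with Int45 = 2 units
# (light fully polarized at 45 degrees):
print(stokes_from_intensities(1.0, 2.0, 1.0))  # -> (2.0, 0.0, 2.0)
```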
[0064] Once the Stokes parameters are calculated relative to angles in the camera's frame, they can be mathematically transformed to describe the polarization in terms of any other frame of reference.
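A common way to perform such a transformation for linear polarization (a standard result, though sign conventions vary with the chosen rotation direction) is to leave I unchanged and mix Q and U through twice the rotation angle:

```python
import math

def rotate_stokes(i, q, u, angle_deg):
    """Re-express (I, Q, U) in a reference frame rotated by angle_deg.
    I is invariant; Q and U mix through twice the rotation angle."""
    a = math.radians(2.0 * angle_deg)
    q2 = q * math.cos(a) + u * math.sin(a)
    u2 = -q * math.sin(a) + u * math.cos(a)
    return i, q2, u2

# Light fully polarized along the 45-degree direction (Q=0, U=1) looks
# fully "horizontal" (Q=1, U=0) in a frame rotated by 45 degrees.
i2, q2, u2 = rotate_stokes(1.0, 0.0, 1.0, 45.0)
print(round(q2, 9), round(u2, 9))  # -> 1.0 0.0
```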
[0065] Despite the use of the term "Stokes parameters" in this description, it should be noted that the calculations used to determine polarization values are not limited to the use of Stokes parameters; they can be based on any mathematical representation of polarization.
[0066] The methodology described here involves the acquisition of polarimetry data from a target using a camera on board a vehicle and the processing of the acquired data using a suitably programmed computer system. The camera has a polarization filter attached, so that the filter has a fixed position relative to the camera lens.
[0067] Figures 1A, 1B and 1C show a camera's pixel grid 10 superimposed on a target scene 12 in respective orientations as the aircraft maneuvers to orient the camera. In this example, the respective polarization angles (indicated by double-headed arrows) are +45° (figure 1A), 0° (figure 1B) and -45° (figure 1C). The dark spot in one corner of the camera's pixel grid 10 marks the same pixel in all images.
[0068] Figures 2A, 2B and 2C show images captured at the instants respectively shown in figures 1A, 1B and 1C. The relatively lightly shaded triangles and the relatively darkly shaded rectangle that partially overlaps a triangle represent idealized features of a target object in the target scene 12. Again, the respective polarization angles are indicated by double-headed arrows.
[0069] After the polarimetry data have been acquired, the data are transferred to a computer system for processing. In figures 2A-2C, the gray-scale value of each pixel is proportional to the intensity of polarized light with the orientation shown for the respective image. To determine the Stokes vector for a given point in a scene, the computer performs calculations using intensity values for pixels that correspond to the same point in the scene (at least one from each of the three images), using the formulas of Eqs. (1)-(4). The process of aligning pixels across two or more images will be referred to here as "image registration". Many methods for image registration are well known in the art. In the systems described here, data on the position and orientation of the camera are usually available for each image; therefore, image registration methods that exploit such data are typically preferred.
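Given three registered images, the per-pixel computation is a direct vectorized application of Eqs. (1)-(4). This sketch assumes the registration step has already aligned the images; the array names and toy values are illustrative:

```python
import numpy as np

def stokes_images(img0, img45, img90):
    """Per-pixel Stokes parameters from three registered images taken
    through a polarizer at 0, 45 and 90 degrees. Inputs are aligned
    2-D intensity arrays; returns a (3, H, W) array of (I, Q, U)."""
    img135 = img0 + img90 - img45        # Eq. (4)
    return np.stack([img0 + img90,       # I, Eq. (1)
                     img0 - img90,       # Q, Eq. (2)
                     img45 - img135])    # U, Eq. (3)

# Toy 2x2 example: the top-left pixel is unpolarized, the other
# pixels are fully polarized at the 0-degree reference angle.
img0 = np.array([[1.0, 2.0], [2.0, 2.0]])
img45 = np.array([[1.0, 1.0], [1.0, 1.0]])
img90 = np.array([[1.0, 0.0], [0.0, 0.0]])
print(stokes_images(img0, img45, img90))
```

In practice the three input arrays would come from the registration step, resampled so that a given (row, column) index refers to the same scene point in all three images.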
[0070] Up to this point in the description, the orientation of the polarizer has been discussed as if it were constant across the image. It is not, as illustrated in figure 3, which graphically represents the fact that perspective distorts the polarizer orientation for pixels that are not at the center of the image. The vertical axis is elevation, while the horizontal axis is azimuth. This graph illustrates the effect of placing a flat polarization filter (not shown in figure 3) with horizontal orientation in front of a camera lens (not shown). The polarization angle is indicated by a double-headed arrow. The thick curved lines labeled "local polarizer orientation" show the resulting polarization at each point in an image. Along the vertical axis of the image, the polarization is horizontal. Likewise, along the horizontal axis of the image, the polarization is horizontal. However, if one could conceive of a polarizer extending infinitely to the left and right, and a camera capable of forming an image spanning 180° of azimuth, one would see the "horizontal" polarization lines distorted by optical perspective. At the extreme left and right, the lines vanish toward optical infinity. Between the center of the image and the edge of the image, the orientation of the local polarizer through which light travels to the focal plane of the camera is not horizontal. The horizontal line in figure 3 shows the local horizontal at each azimuth position (it is assumed that this image was taken at high altitude, so the Earth's limb is below the zero-elevation line). The local horizontal is not parallel to the orientation of the polarizer. For any image more than a few degrees wide, the deviation is significant and must be treated mathematically.
[0071] Methods for calculating the actual orientation of the polarizer at each point in an image are well known in the art. The process step called "calculating polarization parameters" herein applies one or more of these methods.
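One simple geometric sketch of such a calculation (an assumption for illustration, not necessarily the method used in any particular embodiment) projects the filter's transmission axis and the local horizontal onto the plane perpendicular to each pixel's line of sight and measures the angle between them. The coordinate conventions below (boresight along +y, z up) are assumptions of this sketch:

```python
import numpy as np

def local_polarizer_deviation(az_deg, el_deg):
    """Angle (degrees) between a nominally horizontal polarizer axis and
    the local horizontal, as seen along the ray at (azimuth, elevation).
    Both directions are projected onto the plane perpendicular to the
    line of sight and compared there."""
    az, el = np.radians([az_deg, el_deg])
    # Line of sight; camera boresight is along +y, z is up.
    d = np.array([np.sin(az) * np.cos(el),
                  np.cos(az) * np.cos(el),
                  np.sin(el)])
    axis = np.array([1.0, 0.0, 0.0])   # filter transmission axis (horizontal)
    horiz = np.cross(d, [0.0, 0.0, 1.0])  # local horizontal at this azimuth
    # Project both onto the plane perpendicular to the line of sight.
    p_axis = axis - np.dot(axis, d) * d
    p_h = horiz - np.dot(horiz, d) * d
    cosang = np.dot(p_axis, p_h) / (np.linalg.norm(p_axis) * np.linalg.norm(p_h))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

# Zero deviation at the boresight and along the image axes, as in
# figure 3; a significant deviation off-axis.
print(local_polarizer_deviation(0.0, 0.0))
print(local_polarizer_deviation(40.0, -20.0))
```

Consistent with the description of figure 3, the deviation is zero along the vertical and horizontal axes of the image and grows for pixels away from both axes.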
[0072] Systems and methods that use a polarization filter to measure the polarization of light in an image according to the principles described here can be embodied in many ways. Several examples of suitable embodiments will now be described in detail.
[0073] First Embodiment. Figures 4A and 4B are top and side views of an idealized fixed-wing aircraft 20 with a single downward-facing camera 16 fixedly mounted on it. A polarization filter 18 is mounted so that it has a fixed position relative to the camera and is arranged in front of the camera lens (not shown). The polarization angle is indicated by a double-headed arrow in figure 4A.
[0074] Figure 5 shows a flight path for the fixed-wing aircraft 20 shown in figures 4A and 4B; this flight path involves changes of heading to orient the polarization filter during three successive straight passes over a target 22. The successive passes are indicated by circled numbers 1, 2 and 3, respectively. The polarization angles for the three passes are indicated by the respective double-headed arrows in figure 5.
[0075] As seen in figure 5, the aircraft can fly along a path with an interlaced pattern to capture images of the same target scene with different filter orientations. (Other flight paths can be used, as long as the polarization filter 18 is oriented along three directions that differ by at least one odd multiple and one even multiple of 45°.)
[0076] Second Embodiment. According to an alternative embodiment, a downward-pointing camera with a polarization filter in a fixed position can be mounted on a helicopter. Because a helicopter can hover in place, the pilot can position the helicopter with the target in the camera's field of view and then hover at that position. While the helicopter hovers, the pilot can yaw the helicopter as three images are captured by the camera at different yaw angles, thereby orienting the polarization filter in three directions while the three images are captured.
[0077] Third Embodiment. According to another embodiment, an aircraft with a single camera pointing forward or backward uses bank angles to achieve different orientations of the polarization filter. Figures 6A and 6B are side and front views of an idealized fixed-wing aircraft 20 with a single forward-facing camera 16 fixedly mounted on it. A polarization filter 18 is mounted so that it has a fixed position relative to the camera and is arranged in front of the camera lens (not shown). The polarization angle is again indicated by a double-headed arrow in figure 6B.
[0078] Figure 7 shows a flight path for the fixed-wing aircraft 20 shown in figures 6A and 6B; this flight path involves changes of bank angle to orient the polarization filter during flight along a straight path 24 (i.e., a line of sight) directed at a target 22. Successive positions of the aircraft along the line of sight are indicated by circled numbers 1, 2 and 3, respectively. The corresponding bank angles of the aircraft 20 are shown to the right of each circled number. The polarization angles for the three aircraft positions are indicated by respective double-headed arrows in figure 7.
[0079] For cases where the airplane can roll through 90° and take three images of the target with adequate resolution and without significant change in the line of sight to the target, the method shown in figure 7 is appropriate. The control system commands the airplane to roll 45° to one side, commands the camera to take a picture, rolls to wings level, takes another picture, rolls to the other side and takes a third picture. Ideally, the second and third pictures are taken at positions along the line of sight from the first image position to the target. This ensures that the camera samples light with almost the same scattering angle and, therefore, the same polarization, in each image.
[0080] In cases that require more precision, or when smoke, dust, fog, etc. scatter a significant amount of light, the method of figure 8 is appropriate. Figure 8 shows a flight path for the fixed-wing aircraft 20 shown in figures 6A and 6B; this flight path involves changes of heading and changes of bank angle to orient the forward-facing camera 16 with polarization filter 18 (see figure 6A) during three successive passes through the same position along an initial line of sight to a target 22. The successive legs of the flight path are indicated by circled numbers 1, 2, 3 and 4, respectively. The first leg 1 is straight and collinear with the camera's initial line of sight to the target 22. Aircraft 20 can have a bank angle of 0° when the first image of target 22 is captured by the onboard camera. After the first image is captured, the aircraft 20 turns left and flies along a second leg 2 that circles back to within a specified proximity of the position at which the first image was captured. During this second pass, aircraft 20 can hold a 45° left bank when the second image of target 22 is captured, as represented by the inset text labeled "~45° left bank" in figure 8. After the second image is captured, the aircraft 20 turns right and flies along a third leg 3 that again circles back to within a specified proximity of the position at which the first image was captured. During this third pass, aircraft 20 can hold a 45° right bank when the third image of target 22 is captured, as represented by the inset text labeled "~45° right bank" in figure 8. After the third image is captured, aircraft 20 can continue flying toward target 22 along a straight leg 4. The polarization angles for the three passes through the same position, but at different bank angles, are indicated by respective double-headed arrows in figure 8.
Within the limits of the aircraft's navigational accuracy, circling back to the position of the first photo places the camera in essentially the same position for all three photos.
[0081] The aircraft that carries the camera and the polarization filter can have fixed or rotary wings. Although most helicopters can yaw while hovering, as in the second embodiment, some cannot reach a large bank angle while hovering. These helicopters can use the maneuvers shown in figure 7 or figure 8. However, some helicopters can achieve a 45° bank angle by accelerating sideways from a hover. They can capture images while darting left and right without moving forward.
[0082] Fourth Embodiment. In any of the aforementioned embodiments, instead of one camera, the aircraft can be equipped with two cameras aimed roughly parallel to each other, each camera having a respective fixed polarization filter, the two filters oriented roughly at 90° to each other. With this arrangement, a 45° change in heading, bank or yaw (depending on the orientation of the cameras) acquires all the linear Stokes parameters in two maneuvers, instead of the three required in the previous embodiments.
[0083] The fourth embodiment imposes the extra weight and cost of an additional camera and filter beyond the single camera required for the first through third embodiments, but it provides some operational savings by using only two maneuvers instead of three. Compared to the prior-art solution with multiple cameras, this embodiment uses fewer cameras, thus saving weight and cost.
[0084] Fifth Embodiment. In the embodiment shown in figures 9 and 10, part of the focal plane array 26 of pixels inside the camera 16 is covered with a polarization filter 18 and part is left uncovered, so that regular (unpolarized) images are produced by the uncovered part of the focal plane array 26. Figure 9 shows a camera configuration in which a polarization filter 18 overlies part of a focal plane array 26 of pixels inside a camera 16. The polarization filter 18 can be attached to the focal plane array 26 using adhesive 25. The focal plane array 26, in turn, is attached to the rear wall 30 of the housing of camera 16.
[0085] Figure 10 is a front view of the polarization filter 18 overlying part of the focal plane array 26. The orientation of the polarization filter 18 is indicated by a double-headed arrow in figure 10. The uncovered part of the focal plane array 26 measures the total intensity, which is one of the measurements used to compute the Stokes parameters. It also provides a conventional image when polarimetry is not needed. The covered part of the focal plane array 26, together with aircraft maneuvering to aim this part of the focal plane array 26 at a target and to orient the polarization filter 18 appropriately, provides intensity measurements at one or two polarization orientations.
[0086] Placing a uniform filter over part of a CCD focal plane array is much more economical and easier than the prior-art solution of placing a particular filter orientation on each pixel. The former technique requires that a piece of plastic or glass be attached with an accuracy of about 1 mm. The task can be done by hand, and it can be used to modify a camera already installed on an aircraft. The latter (prior-art) technique requires that roughly a million individually oriented filters be positioned to within a fraction of the width of a pixel, for example, a micron or two. It requires precise electro-optical manufacturing systems and can plausibly be done only in a factory.
[0087] In an alternative embodiment (not shown), the non-polarizing part is covered with a neutral-density optical filter that transmits about 50% of the incident light. Since a polarization filter transmits about 50% of the incident light when the scene is unpolarized or only slightly polarized (as in most outdoor scenes), the 50% gray filter roughly matches the transmittance of the polarizer. Matching the transmittance means that both halves of the CCD image are approximately equally well exposed, which improves image usability and intensity resolution.
[0088] Sixth Embodiment. In the embodiment shown in figure 11, camera 16 is modified to have two polarization filters 18a and 18b with different orientations in front of, and covering, respective halves of the focal plane array 26. The aircraft maneuvers to image the target on each section of the focal plane array 26, instead of rotating around its optical axis. This enables the measurement of several polarizations with one or a few small re-pointings of the aircraft, instead of multiple large maneuvers. With a forward-facing camera, the configuration of figure 11 needs only a 45° roll plus a small change of heading or pitch in order to take measurements at three different polarization angles.
[0089] The configuration shown in figure 12 comprises three linear polarization filters 18c-18e with respective orientations and a circular polarization filter 18f in front of, and covering, respective quadrants of the focal plane array 26. This configuration typically requires only a degree or two of change in heading or pitch to make measurements at three or four polarization angles (that is, the aircraft does not need to roll). The circular polarization filter 18f can measure the complete Stokes vector in applications where circular polarization is significant. Alternatively, the quadrant of the focal plane array 26 covered by the circular polarization filter 18f may instead be covered by a neutral-density filter to provide a measurement of unpolarized intensity.
[0090] Seventh Embodiment. Another embodiment exploits the presence of a gimbal-mounted camera on some aircraft. Figure 13 is a diagrammatic top view of a typical gimbal-mounted camera 16 with a lens unit 28. Figure 13A is a sectional view of the gimbal-mounted camera shown in figure 13, the gimbal being sectioned along a plane indicated by line 13A-13A in figure 13. The gimbal 32 has two mutually perpendicular axes of rotation. Camera 16 can swing left and right around the gimbal's azimuth axis and can rotate around the gimbal's elevation axis to point the lens unit 28 up and down. In this configuration, the azimuth and elevation axes are perpendicular to the optical axis of camera 16 and to each other.
[0091] According to a seventh embodiment shown in figures 14, 14A and 14B, a gimbal-mounted camera 16 is arranged to facilitate changing the orientation of a polarization filter 18 that is attached to the lens unit 28 of the camera 16. As seen in figure 14, the camera 16 is mounted transversely on the gimbal 32 in such a way that the former elevation axis is parallel to the optical axis of the camera 16. In this configuration, the camera 16 can rotate around the former elevation axis between first and second angular positions, causing the polarization filter 18 to swing up and down, as seen in figures 14A and 14B. For purposes of illustration, the amount of rotation shown in figures 14A and 14B is taken to be 90°. When the camera 16 is in the first angular position, the polarization filter 18 is oriented horizontally (figure 14A); when the camera 16 is in the second angular position, the polarization filter 18 is oriented vertically (figure 14B). In figures 14A and 14B, the straight double-headed arrows indicate the respective orientations of the polarization filter 18, while the curved double-headed arrows indicate the path of the center of the polarization filter as the camera 16 rotates between the first and second angular positions. The ability to change the orientation of the polarization filter 18 enables the camera 16 to provide images at various polarization angles. In this seventh embodiment, the former elevation axis no longer points the camera 16 up and down. The azimuth axis continues to provide left-right pointing over about half the range it had in a prior-art device. The aircraft's maneuvers provide pointing about the other axes.
[0092] Eighth Embodiment. Figure 15 shows a side view of an unmanned aerial vehicle 20 with a camera 16 mounted on a gimbal 32 (partially shown) in a ball turret 34. For gimbal-mounted cameras that look out through a clear ball turret 34 (or window), a part of the ball turret 34 (or window) can be covered by a polarization filter 18 to polarize a part of the camera's field of view, as shown in figure 16. For conventional imaging, the gimbal 32 is used to aim the camera 16 out through the unfiltered part of the ball turret 34 (or window). For polarimetry, the gimbal 32 is used to aim the camera 16 at the target (not shown in figure 16) and the aircraft is oriented to place the polarization filter 18 between the camera 16 and the target. If multiple orientations of the filter are necessary, the aircraft 20 performs maneuvers in the manner previously described for other embodiments to orient the polarization filter 18.
[0093] It is understood that UAV operators rarely point the camera 16 through the lower rear part of the ball turret 34. Using this position for a polarization filter 18 therefore has minimal impact on ordinary operations while still enabling the acquisition of polarimetry data. As in the third embodiment, rolling the aircraft to the left or right changes the orientation of the filter.
[0094] Figure 16 shows the polarization filter 18 mounted inside the ball turret 34. In cases where this is not feasible, the polarization filter 18 can be mounted outside the ball turret 34 using an appropriate fairing to minimize aerodynamic drag.
[0095] Another option (not shown in the drawings) is to mount the polarization filter 18 on one side of the ball turret 34, for example, the starboard side. Then a UAV that circles a target counterclockwise in a left bank can acquire ordinary unpolarized images, while by circling the target clockwise in a right bank, the UAV can acquire polarized images. Viewing the target at various positions in the focal plane, along with changes in the UAV's pitch angle, allows polarization measurements at various orientations.
[0096] The embodiments described up to this point operate on the principle of maneuvering a vehicle so that the orientation of a polarization filter is varied while imaging a target with a camera. Other embodiments operate on a principle that exploits the optical properties of a camera without a dedicated polarization filter to determine the amount of polarized light in a scene. According to some embodiments, a system and method are provided that determine the polarization of light from one or more objects in a scene without using a polarization filter. A series of images is acquired with the camera oriented at various angles, so that objects appear at various positions in the focal plane of the camera. Light that strikes the lens at a non-perpendicular angle is partially reflected, with the reflected light being polarized parallel to the lens surface and the transmitted light being polarized perpendicular to the lens surface. Comparing images in the series, one expects to see the intensity of each polarized object vary with the position of its image projected on the focal plane. This variation in intensity reveals the polarization of light from each object.
[0097] For a typical embodiment that uses a camera without a polarization filter, the system comprises: an aircraft; an onboard navigation and control system with the previously described capabilities; an onboard camera with a known orientation relative to the aircraft; an onboard control system capable of controlling the camera to capture images when the aircraft arrives at a selected position in one of the selected orientations; a computer (onboard or on the ground) programmed to register images and compute a target's polarization values according to stored data that represent a characterization of the camera's polarizing power; and a device for transferring images from the camera to the computer.
[0098] Embodiments that do not use a polarization filter employ devices and methods to characterize the polarizing power of the camera (specifically, two elements of its Mueller matrix) as a function of angle, so that the camera can be used as described in the preceding paragraph. This characterization of the camera's polarizing power involves a polarized light source with a known polarization angle and degree (typically used in a laboratory or factory); a camera; a computer configured to receive images from the camera; and software on the computer for processing images generated with the polarized light source and the camera to determine the elements of the Mueller matrix that characterize the polarizing power of the camera.
[0099] The main steps in a process for acquiring polarimetry data using a camera without an attached polarization filter are as follows: (1) By measurement or by calculation, the polarizing power of the camera (i.e., its Mueller matrix) as a function of angle is determined. (2) After characterizing the polarizing power of the camera, a series of images of a target is captured with the camera. The orientation of the camera is changed between successive images so that the target is imaged at various points in the focal plane of the camera. For some applications, the camera is mounted on an aerial vehicle and the orientation of the camera is controlled by maneuvering the aerial vehicle. (3) The captured image data are then transferred from the camera to the computer. (4) The computer then processes the image data, using the camera's Mueller matrices to calculate the amount and angle of polarization in the light coming from the target.
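Step (4) can be sketched as a least-squares problem: each image contributes one equation in which the measured intensity equals the first row of that orientation's Mueller matrix applied to the target's (unknown) Stokes vector. The row values below are illustrative placeholders, not a characterization of any real lens:

```python
import numpy as np

def recover_stokes(mueller_rows, intensities):
    """Solve for a target's (I, Q, U) Stokes vector given, for each
    image, the first row of the camera's Mueller matrix at the target's
    pixel and the intensity measured there. With three or more
    well-conditioned rows, least squares determines the vector."""
    a = np.asarray(mueller_rows, dtype=float)   # shape (n_images, 3)
    b = np.asarray(intensities, dtype=float)    # shape (n_images,)
    s, *_ = np.linalg.lstsq(a, b, rcond=None)
    return s

# Hypothetical first rows for three camera orientations:
rows = [[0.5, 0.5, 0.0],    # behaves like a horizontal analyzer
        [0.5, -0.5, 0.0],   # like a vertical analyzer
        [0.5, 0.0, 0.5]]    # like a 45-degree analyzer
# Intensities that a target with S = (2, 0, 2) would produce: 1, 1, 2.
print(recover_stokes(rows, [1.0, 1.0, 2.0]))
```

With more than three images the system is overdetermined, which is useful in practice because individual intensity measurements are noisy.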
[00100] Before describing several embodiments that are based on the characterization of the polarizing power of a camera, further discussion of aspects of light polarization will be useful. Physicists and engineers describe the polarization of electromagnetic waves as having two orthogonal components corresponding to the directions in which the electric field oscillates. In strongly polarized radiation, one of these components is much stronger than the other. Natural sunlight is unpolarized, that is, the two polarization components are of equal magnitude.
[00101] It is well known that partial reflection at a transparent surface can divide a beam of light into two beams, each of which is partially or completely polarized. This can be demonstrated by passing light through a flat sheet of glass at an oblique angle. Figure 17A shows no polarization when light strikes a flat sheet 40 made of glass at normal incidence. Figure 17B shows a stronger reflection of s-polarized light at an oblique angle, which increases the proportion of p-polarized light in the transmitted beam. Only the first reflection at the surface is shown; in reality, there is also reflection from the rear surface. Figures 17A and 17B (and other figures) follow the common convention of naming the two polarization components s and p, each named as a mnemonic for what it is parallel to: s is parallel to the surface and p is parallel to the plane of incidence. In the figures, p-polarized light is shown by arrows that indicate a vector in the plane of the page and s-polarized light is shown by circles that indicate a vector perpendicular to the page. The intensity of each polarization component is indicated by the length of each arrow or the diameter of each circle. The light reflected from each surface is mainly s-polarized when it does not strike at an angle of incidence close to 0° (the situation shown in figure 17B). The remaining light in the transmitted beam is somewhat depleted in the s component and is, therefore, slightly more p-polarized when the incident beam is not striking at an angle of incidence close to 0°. The ratio of the two components depends on the angle of incidence and the refractive index of the glass. The amplitude coefficients for reflection and transmission of waves polarized parallel and perpendicular to the surface can be calculated using the Fresnel equations. For any incident angle θi, the Fresnel equations are as follows:

rs = (ni cos θi - nt cos θt) / (ni cos θi + nt cos θt) (5)

rp = (nt cos θi - ni cos θt) / (nt cos θi + ni cos θt) (6)

ts = 2 ni cos θi / (ni cos θi + nt cos θt) (7)

tp = 2 ni cos θi / (nt cos θi + ni cos θt) (8)
where ni is the index of refraction of the incident medium, nt is the index of refraction of the transmitting medium, θi is the incident angle and θt is the transmitted angle, which can be calculated from ni, nt and θi using Snell's law.

Mueller Matrices

As previously discussed in relation to Eqs. (1)-(4), Stokes parameters can be calculated based on angles relative to an optical element. A polarization filter, a camera lens or another optical element can transform polarized light describable by a first Stokes vector into light describable by a second Stokes vector. The most common way to describe this transformation mathematically is the Mueller calculus, in which the transformation is specified by a 4 x 4 matrix. The formalism takes the form of Eq. (9):

S2 = M S1 (9)

in which S1 is the first Stokes vector, M is the Mueller matrix of an optical element and S2 is the second Stokes vector. The Mueller matrix of a perfect horizontal polarization filter is as follows:

        | 1 1 0 0 |
M = 1/2 | 1 1 0 0 |
        | 0 0 0 0 |
        | 0 0 0 0 |
For a perfect vertical polarization filter, the matrix is:

        |  1 -1 0 0 |
M = 1/2 | -1  1 0 0 |
        |  0  0 0 0 |
        |  0  0 0 0 |
Eq. (10) is an example that shows how the Mueller calculus works. An incoming beam of light polarized at 45° upward to the right, with intensity 1 (represented at the far right of Eq. (10) by a vector S1), passes through a vertical polarization filter (represented by a Mueller matrix) and becomes a vertically polarized beam with intensity 1/2 (represented on the left side of Eq. (10) by a vector S2):

|  0.5 |       |  1 -1 0 0 | | 1 |
| -0.5 | = 1/2 | -1  1 0 0 | | 0 |    (10)
|  0.0 |       |  0  0 0 0 | | 1 |
|  0.0 |       |  0  0 0 0 | | 0 |
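The worked example of Eq. (10) can be checked directly with the Mueller calculus. Note that sign conventions for Q vary; this sketch uses the common convention in which positive Q means horizontal polarization:

```python
import numpy as np

# Mueller matrices of ideal horizontal and vertical linear polarizers.
M_HORIZ = 0.5 * np.array([[1, 1, 0, 0],
                          [1, 1, 0, 0],
                          [0, 0, 0, 0],
                          [0, 0, 0, 0]], dtype=float)
M_VERT = 0.5 * np.array([[ 1, -1, 0, 0],
                         [-1,  1, 0, 0],
                         [ 0,  0, 0, 0],
                         [ 0,  0, 0, 0]], dtype=float)

# Eq. (10): unit-intensity light polarized at 45 degrees, through a
# vertical polarizer, becomes vertically polarized light with
# intensity 1/2, i.e. the Stokes vector (0.5, -0.5, 0, 0).
s1 = np.array([1.0, 0.0, 1.0, 0.0])
s2 = M_VERT @ s1
print(s2)
```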
[00102] In the examples shown, a single Mueller matrix describes the entirety of an optical element, for example, a polarization filter or a camera lens. In the case of a camera lens, the Mueller matrix depends on the angle of incidence θi at which a particular ray of light strikes the lens and on the orientation angle ψ around the optical axis. Therefore, this description sometimes refers to specific Mueller matrices as M(θ) when only the angle of incidence matters, and at other times as M(θ, ψ), or some similarly specific term, when both parameters matter.

Simple Qualitative Example
[00103] In a CCD camera, the lens focuses arriving light onto various points of a focal plane array. Figures 18A through 18C are diagrams showing, respectively, different amounts of polarization by a lens 42 at different angles θ = 0, ~20° and ~40° relative to the optical axis of a camera 16, corresponding to different pixel positions. The angle at which the light reaches the lens 42 determines the position at which the light is focused on the focal plane array 26.
[00104] Figure 19 shows different polarization of light that passes through a lens 42 at different angles, corresponding to different positions of an object's image on a focal plane array 26. The eccentricity of each ellipse shows the degree of polarization; the ellipse's orientation shows the direction of polarization. Since light arriving at zero angle of incidence is focused at the center of the focal plane array 26 (see figure 18A), light focused at the center of the focal plane array 26 incurs no polarization from the lens 42 (see figure 19). Light that arrives at a large angle is focused near the edge of the focal plane array 26 (see figure 19), so light that illuminates the edge of the focal plane array 26 incurs maximum polarization from the lens 42. As a result, lens 42 acts as a polarization filter: if the light arriving at camera 16 from the external scene is already polarized perpendicular to the lens's polarization effect, then lens 42 reduces the light's intensity. This means that the apparent intensity of a given object in the scene depends on (a) its real intensity, (b) its polarization and (c) its position in the focal plane.
[00105] Figures 19A through 19C show different intensities of an object in the scene at different pixel positions that reveal its polarization (line width indicates intensity) for vertically polarized light (figure 19A), horizontally polarized light (figure 19B) and unpolarized light (figure 19C). As seen in figure 19A, a vertically polarized object remains bright when it appears near a vertical line through the center of the focal plane, but dims along an arc to the right or left of the center. A horizontally polarized object remains bright when it appears near a horizontal line through the center of the focal plane, but dims along an arc above or below the center (see figure 19B). As seen in figure 19C, the intensity of an unpolarized object fades with distance from the center of the focal plane, regardless of the object's direction from the center.
[00106] The foregoing discussion was based on light with variable polarization and variable arrival angle interacting with a fixed camera and its focal plane. Alternatively, one can think of the incoming light as having fixed polarization in fixed coordinates, for example, oscillating in the x direction, while the orientation of the camera changes. Figures 20A - 20C illustrate this approach.
[00107] Figure 20A shows a sectional view of a non-tilted lens 42a (zero angle of incidence) and a lens 42b tilted at a non-zero angle θ. These lenses are sectioned along a plane indicated by line 20A - 20A in figure 20B. Figure 20B shows frontal views of the lenses 42a, 42b represented in figure 20A and of other lenses 42c, 42d that are tilted by the same angle θ with respect to the incoming light, but at different orientation angles ψ around the optical geometric axis. Figure 20C represents object images projected onto a focal plane array 26 by the lenses represented in figure 20B, where the focal plane array 26 is parallel to each lens and centered on the optical geometric axis of the lens, as in a typical camera. The angle ψ corresponds to the angular position around the center of the focal plane. Light that passes through the lenses in these orientations focuses at different points on the focal plane array 26. Therefore, the ψ coordinate of a pixel in the focal plane corresponds to the orientation of the light relative to the lens surface when the light passed through the lens. This affects the relative amounts of s polarization and p polarization in the incoming polarized light. Horizontally polarized incoming light that focuses on a location with ψ = 0 is p-polarized with respect to the lens. Horizontally polarized incoming light that focuses on a location with ψ = 90° is s-polarized. According to some embodiments, the aerial vehicle can be maneuvered in such a way that the camera is oriented at different angles so that light from a single target is focused on points with different ψ values.
Complications
[00108] The optical path from a target to a camera's CCD sensor introduces additional complications that must be considered if the camera is to be characterized by a correct Mueller matrix.
[00109] Curved lens. With a flat sheet of glass, arriving collimated light reaches each point on the surface at the same angle and, therefore, each point on the surface polarizes the transmitted beam to the same degree as every other point (see figure 17B). A camera lens has a curved surface, so a collimated beam of light does not reach the lens at the same angle over the entire surface. There is, therefore, a slightly variable degree of polarization for light transmitted through various points on lens 42 (see figure 21A). However, for a lens with circular symmetry (that is, almost all lenses) and a target close to the center of the image, polarization incurred at any point A on the lens is almost canceled by the nearly equal and opposite polarization of light at a point B equidistant from the center of the lens and 90° around the geometric axis from point A. Therefore, the net effect is similar to that of a flat glass sheet: light that arrives parallel to the geometric axis of the lens (that is, roughly perpendicular to the lens surface) and is focused on a point close to the center of the focal plane array incurs little net polarization in passing through the lens; but light that arrives at a substantial angle to the geometric axis of the lens and is focused far from the center of the focal plane array incurs stronger net polarization (see figure 21B). The degree of polarization varies slightly across the lens surface; only the central beam is shown in figure 21B.
[00110] A narrow camera aperture minimizes the effect of a curved lens surface: the lens curves very little over the area of a small aperture. A wide aperture increases non-canceling differences between widely separated parallel paths through the lens. Therefore, some embodiments include aperture width as a parameter, as well as θ, in determining Mueller matrices for the camera.
[00111] Multiple lenses. So far, this description has treated a lens as a single piece of glass with reflections at the front and rear surfaces. Typically, any camera, including aerial surveillance cameras and the now common point-and-shoot and single-lens reflex cameras, will have multiple lenses combined into a single lens unit of the camera. Each lens consists of lens elements. Some are cemented together; others are not, instead having air - lens interfaces. Multiple lens elements are used to control aberrations and provide a clear image. Partial reflections can occur at each interface, increasing the degree of polarization for light paths off the geometric axis. For example, figure 22 shows a basic arrangement of an internal focus lens system comprising a first fixed lens group 50, a second lens group 52 to perform a zoom operation, an iris stop 54, a third fixed lens group 56, a fourth lens group 58 (referred to as a focus lens) with both a focusing function and a so-called compensator function to compensate for the movement of the focal plane caused by zooming; and an image sensing device, such as a focal plane array 26. To reduce image artifacts, such as flare, and to increase the amount of transmitted light, lens manufacturers typically coat elements with anti-reflective coatings, possibly consisting of multiple layers and typically more effective at some wavelengths than at others. These reduce, but do not eliminate, the polarization added at each interface.
[00112] Focal plane optics. Once through the camera lens, the light falls on the focal plane array, which is typically a CCD detector. Like the lens, the CCD can also increase polarization. The CCD is a multilayer device that not only collects light by converting photons into an electrical signal, but also typically filters the light through an infrared filter and a color filter array (often a Bayer filter). Figure 23 shows a sectional view of a typical CCD unit cell. The unit cell comprises a sensor 60, a color filter array 62 and an on-chip microlens 64 (A monochrome device does not have color filters as part of the CCD.).
[00113] As seen in figure 23, light first encounters the on-chip microlens 64, which is used to maximize light collection and direct the light to sensor 60. The light then passes through the color filter array 62. Typically, the color filter array will be a Bayer filter made up of red, green and blue color filters patterned across the chip. At each interface through microlens 64 and color filters 62, some reflection occurs. The further off the geometric axis, the more this reflection increases the polarization.
[00114] Sensor surface. Another partial reflection occurs at the surface of the sensor itself, slightly increasing the polarization even further.
Methods to Characterize a Camera
[00115] Computational method. One method to characterize a camera, that is, to determine its Mueller matrices for several different angles of incidence, is to import a detailed geometric and material model of each lens element, coating, adhesive and focal plane optic into optical analysis software. The software calculates the polarization of light that arrives at each point on the focal plane array. This method is not new.
[00116] Experimental method. A second method is to make a series of intensity measurements and calculations using an experimental procedure. This method is easier than the computational method, because all the complications described above are automatically accounted for. This experimental method comprises the following steps.
[00117] First, configure the camera, including the selected lens, in a controlled optical environment. The setup typically includes an optical target, a polarizer that can be rotated to a selected orientation, and a device to rotate the camera around at least one geometric axis to take pictures in which the target appears at various known angles off the geometric axis. An example of such an experimental setup is shown in figure 24. A CCD camera 16 with a focal plane array 26 of sensors and a lens unit 28 is mounted on a pan - tilt mechanism 68. The lens unit 28 of camera 16 is aimed at a light source 66 with a polarization filter 18 disposed between them. Light source 66 emits unpolarized light that is filtered through the polarization filter to produce polarized light waves that strike lens unit 28 (Although figure 24 shows light source 66 emitting light directly towards camera 16, in the actual experiment reported below, the camera received light after it had been emitted towards a sheet of white paper by a light source and then reflected towards the camera by the white paper.).
[00118] After setup, images of the target are captured at various camera positions and filter orientations. The Mueller matrix M(θ) is different at each off-axis angle θ. Therefore, image measurements must be made both (a) at each angle θi for which a precise Mueller matrix is desired and (b) at angles spaced closely enough to allow sufficiently accurate interpolation. At least two images must be captured at each angle θi; as described below, typically three images are used to improve accuracy.
[00119] The range of angles θi at which images are captured varies with the camera and the application. A single reference image should be captured at angle θ = 0. Since a camera has the strongest polarization effect at the highest incident angle, a typical application captures images at angles from the reference image at θ = 0 up to the largest possible incident angle, that is, a position with the target farthest from the center of the image. In most cases, this places the target in a corner of a rectangular image. For computational simplicity, some embodiments use images with the target in the middle of an edge of a rectangular image, even though that is not as far from the center of the image as a corner.
[00120] The next few sections describe conditions for capturing the reference image at θ = 0 and a set of images at some value θi ≠ 0.
Reference Image: On the Geometric Axis, that is, θ = 0
[00121] The reference image uses an incident angle of 0°, that is, the target is centered in the image. At this angle, the lens and focal plane optics are treated as an ideal clear filter (They are not, but this is not discernible unless a better camera or other instrument is used.). The corresponding Mueller matrix is the identity matrix, as shown in Eq. (11):
[I0; 0; 0] = [1 0 0; 0 1 0; 0 0 1] [I0; 0; 0] (11)
The target emits and/or scatters unpolarized light, which is described by the Stokes vector at the right end of Eq. (11).
[00122] Each captured image is an array of pixel values Pj. Each pixel value is proportional to the intensity of light that strikes a corresponding point in the target scene. The pixel value P0 measured at the target point in the reference image defines the reference intensity I0, as shown in Eq. (12).
I0 = P0 (12)
[00123] Non-unit coefficients in Mueller matrices corresponding to other angles θi indicate changes relative to this reference intensity.
Images at θ = θi, ψ = 0
[00124] All images taken at each incident angle θ = θi use the same rotation angle ψ. The rotation angle ψ is defined by the plane that contains the target, the center of the lens and the point on the lens that is farthest from the target (see figure 20C). The rotation angle ψ defines the coordinate system for polarization, that is, all characterization images are defined as having ψ = 0. Light polarized at ψ = 0 is defined as horizontally polarized, that is, Q = 1 in the chosen coordinates. This definition makes horizontally polarized light identical to p-polarized light for the lens and vertically polarized light identical to s-polarized light.
Non-Polarized Target
[00125] One of the images at θ = θi views the target with unpolarized light, that is, no polarization filter is used other than the camera itself. This corresponds to the input Stokes vector shown at the right end of Eq. (13):
Sin = [I0; 0; 0] (13)
[00126] The image includes the measured pixel value Pθunp at the target. The light reaching the camera from the target has the same intensity as in the reference image, but the measured pixel value is different, so the Mueller matrix element A11 for θ = θi is calculated using Eq. (14) and the measured values P0 and Pθunp:
A11 = Iθunp = Pθunp / P0 (14)
Horizontally Polarized Target (Polarized in p)
[00127] One of the images at θ = θi can view the target with light that has passed through a horizontal polarizer (Since the camera is oriented with ψ = 0, horizontal polarization is the same as p polarization.). This corresponds to the input Stokes vector at the right end of Eq. (15), including the effect of the filter on the total intensity (For an ideal polarization filter, the intensity coefficient is 1/2, as shown. One of the methods described below measures the actual coefficient.).
Sin = (1/2) [I0; I0; 0] (15)
[00128] The image includes the measured pixel value Pθp at the target. From Eq. (15), we see that the Mueller matrix element A12 for θ = θi is related to the pixel value by Eq. (16):
Iθp = Pθp / P0 = (1/2)(A11 + A12) (16)
[00129] This equation can be rearranged to acquire A12, as shown in Eq. (17).
A12 = 2 Iθp - A11 (17)
[00130] Vertically Polarized Target (Polarized in s)
[00131] One of the images at θ = θi can view the target with light that has passed through a vertical polarizer (Since the camera is oriented with ψ = 0, vertical polarization is the same as s polarization.). This corresponds to the Stokes vector at the right end of Eq. (18).
Sin = (1/2) [I0; -I0; 0] (18)
[00132] The image includes the measured pixel value Pθs at the target. From Eq. (18), we see that the Mueller matrix element A12 for θ = θi is related to the pixel value by Eq. (19):
Iθs = Pθs / P0 = (1/2)(A11 - A12) (19)
[00133] This equation can be rearranged to acquire A12, as shown in Eq. (20):
A12 = A11 - 2 Iθs (20)
[00134] Using s- and p-Polarized Images to Derive an Average Estimate of A12
[00135] Some embodiments use both the horizontally polarized and the vertically polarized images above. These embodiments combine the data to reduce the effect of noise and thus improve the estimate of A12. In these embodiments, Eq. (17) is added to Eq. (20) and the sum is divided by 2 to compute an average estimate of A12: A12 = Iθp - Iθs (21)
[00136] Using both horizontally and vertically polarized images produces another advantage: an estimate of the intensity coefficient of the polarization filter used in these measurements. The intensity coefficient describes the fraction of unpolarized light that passes through a filter. As stated, an ideal polarization filter has an intensity coefficient of 1/2. For a real polarization filter, the intensity coefficient can be computed as the average fraction of light that passes through the filter in any two perpendicular polarizations, for example, s-polarized and p-polarized. The pixel intensity for unfiltered light has already been measured as Iθunp, as seen in Eq. (14). Therefore, the filter intensity coefficient can be computed as:
c = (Iθp + Iθs) / (2 Iθunp) (22)
[00137] This value replaces the coefficient 1/2 in Eqs. (15) and (18), leading to better numerical coefficients in Eqs. (17), (20) and (21) to estimate A12.
Mueller Matrix Measurement Example
[00138] The aforementioned method was used to characterize a Canon EOS Rebel 300D camera with a Canon EFS 18-55 mm zoom lens set at a focal length of 18 mm and a clear filter in front of the lens. The light source was a sheet of white printer paper illuminated by a fluorescent desk lamp. The white printer paper had a target symbol in the shape of a cross drawn on it (a part of this target symbol is shown in figure 25). Images were saved in 8-bit JPEG format. This level of quality is enough to show feasibility. In a more rigorous characterization, the camera can be configured to produce images in 12-bit RAW format, which provides higher resolution and does not introduce compression errors.
[00139] The characterization method that will now be described uses a set of four pixels for each calculation, but this is simply an averaging technique to reduce noise - it is not required. The most general approach is based on individual pixel values.
[00140] First, an on-axis reference image was captured with the target in the center of the image. Light from the target was unpolarized. Figure 25 shows a close-up of the image. MATLAB was used to mark the target pixel and measure its RGB values. The target pixel was located at column 1,536 (X) and row 1,024 (Y) of the pixel image. The target pixel measured R, G and B values of 232, 181 and 124, respectively. The four pixels adjacent to the target pixel measured average R, G and B values of 237.25, 182.5 and 127.5, respectively.
[00141] Then, off-axis images were captured with the target close to the right edge of the image and still without a polarization filter. The target angle from the center of the image was θ = θ0 = 28.3°. In this instance, the target pixel was located at column 2,850 (X) and row 1,024 (Y) of the pixel image. The average R, G and B values measured for the four pixels adjacent to this target pixel were now 209.75, 167.5 and 115.25, respectively.
[00142] Subsequently, a polarizing filter (a lens from polarized sunglasses) was placed between the target and the camera (while the angle of the target from the center of the image remained θ = θ0 = 28.3°). First, the polarization filter was oriented so that the light from the target was horizontally polarized, that is, p-polarized relative to the lens and focal plane optics. In this case, the four pixels adjacent to the target pixel (that is, X: 2,850; Y: 1,024) measured average R, G and B values of 118, 82 and 44.25, respectively. Then, the polarization filter was oriented so that the light from the target was vertically polarized, that is, s-polarized relative to the lens and focal plane optics. In this case, the four pixels adjacent to the target pixel measured average R, G and B values of 104.75, 80.75 and 34.75, respectively. As expected, the p-polarized values are higher than the s-polarized values: each interface in the camera reflects more s-polarized light than p-polarized light.

[00143] The following table shows an example of calculating the first two elements of the Mueller matrix from the measurements described above.
[00144] The "target point" line in the table specifies the off-axis pixel coordinates of the visual target (the on-axis center of the focal plane was at 1,536 and 1,024). Each set of measurements in the table comprises four pixels diagonally adjacent to the target pixel. The columns labeled R, G and B show pixel values measured for each color; the "Intensity" column is the average of these values. The mean and median for each color and intensity are shown on the two lines immediately below each data set. The third line from the bottom shows the first Mueller element (A11) calculated from the average pixel value for each color and for the total intensity. The bottom line shows the second Mueller element (A12) for each.
[00145] The data show a relatively strong intensity ratio for horizontal versus vertical polarization in the red and blue bands, but a relatively weak ratio in the green band. This is probably because the anti-reflective coating on the lens is optimized to reduce reflection in green light, the band to which human visual perception is most sensitive. Since the methodology described here relies on unequal reflection to induce polarization, minimal reflection in the green band corresponds to minimal polarization in the same band. The second Mueller element in the green band is shaded to indicate that polarization measurements in the green band at this value of θ may not be reliable.
[00146] The example includes calculations of the intensity coefficient of the polarization filter as in Eq. (22). A coefficient for each color is shown on the line in the table labeled "neutral polarizer density".
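The table's arithmetic can be sketched in code. The sketch below applies Eqs. (14), (21) and (22) (with the measured filter coefficient replacing the ideal 1/2) to the four-pixel averages reported above; the function name is illustrative.

```python
# Sketch of the Mueller-element calculation from the example measurements
# (four-pixel average R, G, B values reported in the text). The arithmetic
# follows Eqs. (14), (21) and (22); the function name is illustrative.

def mueller_elements(p0, p_unp, p_p, p_s):
    """Return (A11, A12, filter coefficient) for one color band.

    p0    : reference pixel value, on-axis, unpolarized target
    p_unp : off-axis pixel value, unpolarized target
    p_p   : off-axis pixel value, horizontally (p) polarized target
    p_s   : off-axis pixel value, vertically (s) polarized target
    """
    a11 = p_unp / p0                 # Eq. (14)
    c = (p_p + p_s) / (2.0 * p_unp)  # Eq. (22): filter intensity coefficient
    # Eq. (21), with the ideal coefficient 1/2 replaced by the measured c:
    a12 = (p_p - p_s) / (2.0 * c * p0)
    return a11, a12, c

# Measured four-pixel averages from the example (R, G, B):
reference   = (237.25, 182.5, 127.5)   # theta = 0
unpolarized = (209.75, 167.5, 115.25)  # theta = 28.3 deg
p_polarized = (118.0, 82.0, 44.25)
s_polarized = (104.75, 80.75, 34.75)

for band, p0, pu, pp, ps in zip("RGB", reference, unpolarized,
                                p_polarized, s_polarized):
    a11, a12, c = mueller_elements(p0, pu, pp, ps)
    print(f"{band}: A11 = {a11:.3f}, A12 = {a12:.3f}, coefficient = {c:.3f}")
```

Consistent with the discussion above, this yields a much smaller A12 in the green band than in the red and blue bands.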
[00147] The camera used in the above experiment had a focal plane array 22.7 mm wide and 15.1 mm high. The focal length of the lens was set at 18 mm. The target point was 28.3° horizontal from the center, that is, θ = 28.3°.
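The stated off-axis angle can be checked against this geometry. The sketch below assumes an image width of 3,072 pixels (implied by the on-axis center at column 1,536) and uses the target column 2,850 from the example.

```python
import math

# Consistency check (sketch): recover the off-axis angle theta of the target
# pixel from the focal-plane geometry given in the text. The image width of
# 3072 pixels is an assumption implied by the on-axis center at column 1536.
SENSOR_WIDTH_MM = 22.7
IMAGE_WIDTH_PX = 3072      # assumed: twice the center column of 1,536
FOCAL_LENGTH_MM = 18.0

center_px, target_px = 1536, 2850
offset_mm = (target_px - center_px) * SENSOR_WIDTH_MM / IMAGE_WIDTH_PX
theta_deg = math.degrees(math.atan(offset_mm / FOCAL_LENGTH_MM))
print(f"theta = {theta_deg:.1f} degrees")  # close to the 28.3 deg in the text
```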
[00148] In the experiment described above, measurements were taken for a single value of θ. A typical application uses similar measurements at multiple values of θ. The resulting Mueller matrix values are stored in a lookup table or are used to fit an equation that can estimate the Mueller matrix values at any angle. These Mueller matrices are a characterization of the polarizing properties of the camera.
Method for Using a Characterized Camera
[00149] Once a camera's polarizing properties have been characterized, the camera can be used to capture images of scenes that contain unknown targets. A method according to one embodiment comprises the following steps:
Step 1: Configure the Camera.
[00150] Camera configuration includes installing the camera and related gear in such a way that the camera can view the same target object with different lens orientations, which typically correspond to different positions on the CCD. This may include attaching the camera to a pan/tilt mount, for example, in a factory, or attaching it to an aircraft or other vehicle.
Step 2: Capture Images.
[00151] Capture multiple (for example, three) images using the same incidence angle θ and several camera orientation angles (for example, ψ = 0°, 45° and 90°). For each image captured in a different orientation, the lens will project the image of a point on the target onto a corresponding position on the CCD chip. The goal is to capture the same scene with similar incidence angles (for example, 30° from the center), but different tilt orientations. In cases where the camera does not rotate around the camera-to-target geometric axis, this objective is equivalent to capturing the scene on different parts of the CCD that are at the same angle from the center. To measure the first three Stokes parameters (which completely characterize the linear polarization), the target is imaged at different angular positions around the optical geometric axis, ideally at 0°, 90° and either 45° or 135°.
[00152] The camera can be mounted on a pan/tilt mechanism. In a factory application, a typical embodiment uses an automated pan/tilt mount or gimbal to orient the camera as described above while images are acquired. In a typical aerial application with a gimbal mount, the gimbal orients the camera while images are being acquired. In aerial cases without a gimbal, or when the gimbal is insufficient, the flight control operator or computer maneuvers the aircraft, spacecraft or other vehicle to orient the camera at different angles for image acquisition.
[00153] Ninth Embodiment. As previously discussed, figures 4A and 4B are top and side views of an idealized fixed-wing aircraft 20 with a single downward-facing camera 16 fixedly mounted on it and a polarization filter 18 attached to the camera. If the polarizing properties of the camera have been characterized as described above, the polarization filter can be omitted. In that case, in the configuration shown in figures 4A and 4B, the camera 16 faces downward when the aircraft is in level flight.
[00154] Figure 26 shows a flight path for a fixed-wing aircraft 20 of the type shown in figures 4A and 4B, except that the polarization filter has been omitted. When a characterized camera is on board, aircraft 20 can acquire polarimetric data from a target 22 by making three steep turns at the same bank angle θ0 (For a downward-pointing camera on an aircraft in level flight, the bank angle is identical to the incidence angle θ.). The successive turns are indicated in figure 26 by circled numbers 1, 2 and 3, respectively. The camera captures an image at the same position directly above the target during each of the three turns. On turn number 1, the camera is tilted to the west when it captures the image; on turn number 2, the camera is tilted to the northwest when it captures the image; and, on turn number 3, it is tilted to the north when it captures the image. These correspond to ψ = 0°, 45° and 90°.
[00155] An airplane with a fixed forward-facing camera can acquire images at θ = -45°, 0° and 45° by yawing and pitching up or down momentarily by about 32° while taking photographs. An airplane with a fixed side-facing camera can acquire images at θ = -45°, 0° and 45° by yawing and rolling left and right by about 32° while taking photographs.
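A rough check of the ~32° figure: composing a rotation α about one body axis with a rotation β about an orthogonal axis tilts the camera boresight by an angle γ with cos γ = cos α · cos β, so α = β ≈ 32° gives an off-axis angle close to the desired 45°.

```python
import math

# Why ~32 deg of simultaneous yaw and pitch (or yaw and roll) yields roughly
# a 45 deg incidence angle: composing rotations about two orthogonal axes
# tilts the boresight by gamma, where cos(gamma) = cos(alpha) * cos(beta).
alpha = beta = math.radians(32.0)
gamma = math.degrees(math.acos(math.cos(alpha) * math.cos(beta)))
print(f"combined off-axis angle: {gamma:.1f} degrees")
```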
[00156] Tenth Embodiment. A hovering helicopter can acquire a similar set of images, with the aircraft and camera tilted at the same angles, without leaving its position above the target. Instead of turning, the helicopter can roll and pitch while moving side to side or back and forth.
[00157] Eleventh Embodiment. A spacecraft can acquire similar images by reorienting itself on successive orbital passes above the target.
Step 3: Compare Images and Determine Polarization
[00158] In the case of aerial vehicles, the captured images can be transmitted via a wireless communication channel to an antenna on the ground for processing by a computer on the ground, or the captured images can be transferred directly to an on-board computer. The images are transferred to a computer that uses the measured intensity as a function of position to determine the Stokes vector Sx of the scene:
Sx = [Ix; Qx; Ux]
[00159] The coordinate system of this vector is discussed below.
[00160] According to a known technique, Stokes parameters can be measured at the system output, for example, Qout = Int0° - Int90°, Uout = Int45° - Int135°. One can concatenate them into an output Stokes vector, multiply this vector by the inverse of the Mueller matrix and acquire the input Stokes vector Sx of the light coming from the target.
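This known technique can be sketched as follows, assuming an ideal linear polarizer for which the intensity measured at polarizer angle φ is (I + Q cos 2φ + U sin 2φ)/2; the function name and sample values are illustrative.

```python
def stokes_from_filtered_intensities(i0, i45, i90, i135):
    """Classical linear polarimetry: recover (I, Q, U) from intensities
    measured behind an ideal linear polarizer at 0, 45, 90 and 135 deg."""
    i = 0.5 * (i0 + i45 + i90 + i135)  # each orthogonal pair sums to I
    q = i0 - i90                       # Qout
    u = i45 - i135                     # Uout
    return i, q, u

# Example: light with I = 1.0, Q = 0.5, U = 0.2 gives filtered intensities
# (I + Q)/2, (I + U)/2, (I - Q)/2, (I - U)/2:
print(stokes_from_filtered_intensities(0.75, 0.6, 0.25, 0.4))
```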
[00161] The technique for acquiring polarimetric data described here works differently. The camera can only measure the total intensity I(θ, ψ), not Q and U, in each orientation. Therefore, a different method is used to compute Sx. For the discussion of the method presented below, it will be assumed that all three images are taken at the same angle of incidence θ0, but the orientation ψ around the geometric axis to the target varies in multiples of 45°. Also, reference will be made to a hypothetical beam of light that is 100% horizontally polarized in a selected reference system. No such light really exists; it is introduced to help the reader understand the coordinate system and the effect of camera reorientation.
θ = θ0, Define ψ = 0: Q = 1
[00162] An image is selected to define a reference system, such that ψ = 0 for the selected image. Hypothetical light that is 100% horizontally polarized in this reference system has Stokes parameter Q = 1 and Stokes parameter U = 0; this hypothetical light is p-polarized relative to the center of the lens. This reference system defines the input Stokes vector as in Eq. (23), and it determines the intensity measured in the image as in Eq. (24):
Sx = [Ix; Qx; Ux] (23)
Iθp = [A11 A12 0] · Sx (24)
[00163] This relates Iθp to A11 and A12, as follows:
Iθp = A11 Ix + A12 Qx (25)
θ = θ0, ψ = 90°: H-in/p-in becomes V-in/s-in: Q = -1
[00164] A second image is used in which the camera is tilted to reach the same angle of incidence θ0 to the target, but the camera is rotated to ψ = 90° around the geometric axis to the target, measured in the same reference system as the image with ψ = 0°. With respect to the lens' transmission of light, the input polarization has rotated by 90°. Hypothetical light that is 100% horizontally polarized in the reference system now has Stokes parameter Q = -1 and Stokes parameter U = 0. This has the effect of exchanging horizontal and vertical polarization, so the effective Stokes vector at the camera is as shown at the right end of Eq. (26):
[Ix; -Qx; -Ux] (26)
[00165] This relates Iθs to A11 and A12 as follows:
Iθs = A11 Ix - A12 Qx (27)
θ = θ0, ψ = 45°: 45°-pol becomes H-pol/p-pol: U = 1
[00166] A third image is used in which the camera is tilted to reach the same angle of incidence θ0 to the target, but the camera is rotated to ψ = 45° around the geometric axis to the target. With respect to the lens' transmission of light, the input polarization has rotated by 45°. Hypothetical light that is 100% horizontally polarized in the reference system now has Stokes parameter Q = 0 and Stokes parameter U = 1. This has the effect of exchanging H-pol and 45°-pol (Q and U, respectively, in the Stokes vector), so the effective Stokes vector at the camera is as shown at the right end of Eq. (28):
[Ix; Ux; -Qx] (28)
[00167] This relates Iθ45 to A11 and A12 as follows:
Iθ45 = A11 Ix + A12 Ux (29)
[00168] Solving for the Input Stokes Vector
[00169] Given measurements of Iθp, Iθs and Iθ45, the computer system programmed to process polarimetric data now has three equations (that is, Eqs. (25), (27) and (29)) in the three unknowns Ix, Qx and Ux. Equation (25) can be added to Eq. (27) to produce: Iθp + Iθs = 2 A11 Ix (30), which is rearranged to acquire Ix:
Ix = (Iθp + Iθs) / (2 A11) (31)
[00170] One can substitute this into Eq. (25) to acquire Eq. (32):
Iθp = (Iθp + Iθs) / 2 + A12 Qx (32)
which can be rearranged to acquire Qx as in Eq. (33):
Qx = (Iθp - Iθs) / (2 A12) (33)
[00171] One can also substitute Eq. (31) into Eq. (29) to acquire Eq. (34):
Iθ45 = (Iθp + Iθs) / 2 + A12 Ux (34)
which can be rearranged to get Ux as follows:
Ux = (2 Iθ45 - Iθp - Iθs) / (2 A12) (35)
[00172] This provides the complete three-element Stokes vector Sx defined in Eq. (23).
[00173] Using the above equations, Stokes parameters can be calculated for each color (R, G, B) and for the total intensity.
[00174] The above method was applied using the measurements from the calibration example. Those measurements did not include any image with ψ = 45°, so the Ux component could not be calculated, but the other calculations confirmed that the process described above leads to the correct values of Ix and Qx, that is, 1 and 1, the reference values used for calibration.
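The three-image inversion can be sketched as follows. The sketch implements the rearrangements for Ix, Qx and Ux above; the Mueller elements A11 = 0.9 and A12 = 0.05 are illustrative, not measured values. Feeding it intensities synthesized for 100% horizontally polarized light recovers the calibration reference vector (Ix, Qx, Ux) = (1, 1, 0).

```python
def input_stokes(i_p, i_s, i_45, a11, a12):
    """Solve Eqs. (31), (33) and (35) for the input Stokes vector
    (Ix, Qx, Ux), given normalized intensities measured at camera
    orientations psi = 0, 90 and 45 degrees."""
    ix = (i_p + i_s) / (2.0 * a11)               # Eq. (31)
    qx = (i_p - i_s) / (2.0 * a12)               # Eq. (33)
    ux = (2.0 * i_45 - i_p - i_s) / (2.0 * a12)  # Eq. (35)
    return ix, qx, ux

# Forward-simulate the three measurements for 100% horizontally polarized
# light (Ix = 1, Qx = 1, Ux = 0) with illustrative Mueller elements:
a11, a12 = 0.9, 0.05
i_p = a11 * 1 + a12 * 1   # Eq. (25)
i_s = a11 * 1 - a12 * 1   # Eq. (27)
i_45 = a11 * 1 + a12 * 0  # Eq. (29)
print(input_stokes(i_p, i_s, i_45, a11, a12))
```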
[00175] Figure 27 is a block diagram that identifies the main components of a system for acquiring polarization values for an imaged target 22 according to one embodiment. The system comprises: an aircraft 20; an on-board navigation and flight control system 70 capable of flying to a three-dimensional position (for example, longitude, latitude and altitude) and subsequently returning the aircraft to approximately the same position at least twice, and also capable of measuring the orientation of the aircraft at that position and setting the aircraft in a different selected orientation when it returns to the same position; an on-board camera 16 mounted on a gimbal 32; actuators 74 coupled to the gimbal 32 to change the orientation of the camera relative to the aircraft 20; an on-board linear polarization filter 18 with a known fixed orientation relative to the camera 16; an on-board camera control system 72 capable of controlling actuators 74 to orient camera 16 to any of a plurality of selected orientations, controlling camera 16 to capture images when the aircraft arrives at the selected position with one of the selected orientations, and then receiving the imaging data from the camera 16; an on-board transmitter 76 coupled to the camera control system 72 to transmit imaging data to a ground station; a receiver 78 at the ground station for receiving the transmitted imaging data; and an imaging data processing computer 80 (on the ground) programmed to register the images and compute polarization values for the imaged target 22.
[00176] The camera control system 72 may comprise a computer with hardware and software. The camera control software comprises: a database containing target position information; a first program for controlling the actuators 74 to change the state of the gimbal 32 and then trigger the camera 16 as a function of the aircraft's current state information (i.e., current aircraft position and orientation) received from the navigation and flight control system 70 during the data acquisition mission and the stored target position information; and a second program for receiving imaging data from the camera 16 and placing it in a format suitable for transfer by the transmitter 76.
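A minimal sketch of the capture-trigger logic in the first program might look like the following. The `Pose` type, the tolerance values and the function names are assumptions for illustration, not details from the patent:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    lat: float
    lon: float
    alt: float
    roll_deg: float  # camera/filter roll about the line of sight

def should_capture(current: Pose, target: Pose,
                   latlon_tol: float = 0.0005, alt_tol: float = 50.0,
                   roll_tol: float = 2.0) -> bool:
    """Trigger the camera when the vehicle has returned near the stored
    position and the commanded roll orientation has been reached.
    Tolerance values here are illustrative assumptions."""
    near = (abs(current.lat - target.lat) < latlon_tol
            and abs(current.lon - target.lon) < latlon_tol
            and abs(current.alt - target.alt) < alt_tol)
    # Wrap the roll difference into (-180, 180] before comparing.
    droll = abs((current.roll_deg - target.roll_deg + 180.0) % 360.0 - 180.0)
    return near and droll < roll_tol
```

In practice the navigation and flight control system 70 would feed `current`, while `target` comes from the stored target-position database.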
[00177] The imaging data processing computer 80 may likewise comprise hardware and software. The imaging data processing software comprises a first program for registering the captured images and a second program for computing polarization values for the imaged target 22.
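Where polarization values are computed from the Stokes parameters, the standard degree and angle of linear polarization can be obtained as sketched below (the function name is illustrative):

```python
import math

def degree_and_angle_of_polarization(I, Q, U):
    """Degree of linear polarization (0 to 1) and angle of polarization
    (in degrees) from the first three Stokes parameters."""
    dolp = math.hypot(Q, U) / I          # sqrt(Q^2 + U^2) / I
    aop = 0.5 * math.degrees(math.atan2(U, Q))
    return dolp, aop
```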
[00178] Alternatively, the camera 16 can be fixedly mounted on the aircraft 20, thereby eliminating the need for the gimbal 32 and the actuators 74. According to further alternative embodiments, the polarization filter 18 can be omitted and/or the computer 80 can be located on board the aircraft 20 (in which case the transmitter 76 instead transmits the processed data to the ground station).
[00179] Additional Embodiments. The polarimetric data acquisition system can be embodied in many ways. Additional examples include at least the following. (1) Characterize the Mueller matrix of the camera not only as a function of angle, but also as a function of aperture. A relatively large aperture allows light to pass through sections of the lens surface at different angles of incidence. (2) Characterize a camera's CCD separately from its lenses, so that users can combine CCDs and lenses in various ways without characterizing each combination. Two optical elements used in series, such as a lens and a CCD, are represented mathematically by successive matrix multiplication using their Mueller matrices, for example, S2 = M_CCD (M_lens S1). If both Mueller matrices are characterized separately, then the input Stokes vector is calculated by inverting both matrices and multiplying them in reverse order: S1 = M_lens⁻¹ (M_CCD⁻¹ S2). (3) Capture images using angles ψ that are not integral multiples of 45° and/or angles ψ that vary between images. These embodiments rely on more tedious and complicated algebra than the approach described by Eqs. (28) to (35), but the derivation and method will be clear to those skilled in the art who have learned the teachings set forth above. (4) Apply the foregoing embodiment (using angles ψ other than 0°/45°/90° and non-identical values of the angle θ) to calculate the input Stokes vector Sx for multiple pixel-sized points in a scene (possibly every pixel-sized point in the scene) using as few as three images covering the scene. This produces a complete polarimetric image (degree and angle of polarization at each point in the scene) without a filter. (5) Attach the camera to a UAV, manned aircraft, helicopter, spacecraft, surface ship or UUV. (6) Use a camera and lens that work in the ultraviolet, visible, infrared or terahertz bands.
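The matrix chaining and inversion in item (2) can be sketched numerically. The Mueller matrices below are assumed illustrative values (a weak linear diattenuator for the lens, a slightly attenuating CCD), not measured camera data:

```python
import numpy as np

# Illustrative Mueller matrices (assumed values, not from the patent).
M_lens = np.array([[1.00, 0.05, 0.00, 0.00],
                   [0.05, 1.00, 0.00, 0.00],
                   [0.00, 0.00, 0.99, 0.00],
                   [0.00, 0.00, 0.00, 0.99]])
M_ccd = np.diag([1.00, 0.98, 0.98, 0.97])

S_in = np.array([1.0, 0.3, 0.1, 0.0])   # Stokes vector entering the lens
S_out = M_ccd @ (M_lens @ S_in)         # S2 = M_CCD (M_lens S1): what is recorded

# Recover the input by inverting each element and applying in reverse order:
# S1 = M_lens^-1 (M_CCD^-1 S2)
S_rec = np.linalg.inv(M_lens) @ (np.linalg.inv(M_ccd) @ S_out)
```

Because each element is characterized separately, any lens/CCD pairing can be inverted this way without re-characterizing the combination.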
[00180] Additionally, the description comprises embodiments according to the following clauses: Clause 1. A method for determining a polarization of a scene, comprising: (a) placing a linear polarization filter in a field of view of a camera comprising a lens and a sensor array; (b) successively locating the camera and the linear polarization filter in proximity to a single position, but in three different orientations, for each of which a scene is in the camera's field of view; (c) capturing first through third filtered images while the camera and the linear polarization filter are in the three different orientations, respectively; (d) transferring first through third sets of imaging data representing, respectively, the first through third filtered images from the camera to a computer system; and (e) computing a polarization of at least one point in the scene from the first through third sets of imaging data. Clause 2. The method according to clause 1, further comprising mounting the camera and the linear polarization filter on a vehicle, wherein step (b) comprises maneuvering the vehicle. Clause 3. The method according to clause 2, wherein the vehicle is an unmanned vehicle. Clause 4. The method according to clause 1, wherein respective angles about a line of sight of the camera relative to a reference for the first two of the three different orientations differ by an odd integral multiple of 45°, and respective angles about the camera's line of sight relative to a reference for the second two of the three different orientations differ by 90°. Clause 5. The method according to clause 1, further comprising registering the first through third sets of imaging data relative to each other before performing step (e). Clause 6. The method according to clause 1, wherein step (e) comprises computing Stokes parameters. Clause 7.
A system for acquiring images of a scene, comprising: an unmanned vehicle; a camera on board said unmanned vehicle, said camera comprising a lens and an array of sensors; a first linear polarization filter disposed in front of at least a first part of said sensor array; an unmanned vehicle control system capable of controlling said unmanned vehicle to perform maneuvers, said unmanned vehicle control system comprising hardware and software, said software of said unmanned vehicle control system being configured to control said unmanned vehicle to position itself at or near a specified position on each of first, second and third occasions and in first, second and third orientations that are different from each other but that each place the scene in a field of view of said camera; and a camera control system arranged on board said unmanned vehicle and capable of controlling said camera to capture images, said camera control system comprising hardware and software, said software of said camera control system being configured to control said camera to capture first, second and third images of a target scene during said first, second and third occasions, respectively, and then transmit first, second and third sets of imaging data representing, respectively, said first, second and third images. Clause 8. The system according to clause 7, further comprising an imaging data processing system capable of processing imaging data, said imaging data processing system comprising hardware and software, said software of said imaging data processing system being configured to register said first, second and third sets of imaging data relative to each other and to compute polarization values for the imaged scene. Clause 9. The system according to clause 8, wherein said polarization values comprise Stokes parameters. Clause 10.
The system according to clause 7, wherein said unmanned vehicle comprises a window, further comprising a gimbal mounted on said unmanned vehicle, said camera being coupled to said gimbal and said linear polarization filter being attached to said window. Clause 11. The system according to clause 7, further comprising a gimbal mounted on said unmanned vehicle, wherein said camera is rotatably coupled to said gimbal for rotation about an axis that is parallel to an optical axis of the camera, and said linear polarization filter is attached to said camera. Clause 12. The system according to clause 8, wherein respective angles about a line of sight of said camera relative to a reference for at least two of said first through third orientations differ by an integral multiple of 45°. Clause 13. The system according to clause 7, further comprising a second linear polarization filter disposed in front of a second part of said sensor array, wherein one of said first and second linear polarization filters is horizontally polarized and the other of said first and second linear polarization filters is vertically polarized. Clause 14. A method for determining a polarization of a scene, comprising: (a) characterizing the polarizing power of a camera comprising a lens and a sensor array; (b) successively locating the camera in proximity to a single position, but in three different orientations, for each of which a scene is in a field of view of the camera; (c) capturing first through third images while the camera is in the three different orientations, respectively; (d) transferring first, second and third sets of imaging data representing the first through third captured images from the camera to a computer system; and (e) computing a polarization of at least one point in the scene from the first, second and third sets of imaging data. Clause 15.
The method according to clause 14, wherein step (a) comprises determining first and second elements of a Mueller matrix. Clause 16. The method according to clause 14, wherein step (a) comprises determining at least one of a first element of the Mueller matrix and a second element of the Mueller matrix for at least two positions in the sensor array, said positions corresponding to different angles of incidence for light passing through the center of the lens. Clause 17. The method according to clause 14, wherein at least one of the three different orientations is chosen such that light falls on a position near an edge or corner of the sensor array. Clause 18. The method according to clause 14, further comprising mounting the camera on a vehicle, wherein step (b) comprises maneuvering the vehicle. Clause 19. The method according to clause 14, wherein respective angles about a line of sight of the camera relative to a reference for at least two of the three different orientations differ by an integral multiple of 45°. Clause 20. The method according to clause 14, further comprising registering the first through third sets of imaging data relative to each other before performing step (e). Clause 21.
A system for acquiring images of a scene, comprising: an unmanned vehicle; a camera on board said unmanned vehicle, said camera comprising a lens and an array of sensors; an unmanned vehicle control system capable of controlling said unmanned vehicle to perform maneuvers, said unmanned vehicle control system comprising hardware and software, said software of said unmanned vehicle control system being configured to control said unmanned vehicle to position itself at or near a specified position on each of first, second and third occasions and in first, second and third orientations that are different from each other but that each place the scene in a field of view of said camera; and a camera control system arranged on board said unmanned vehicle and capable of controlling said camera to capture images, said camera control system comprising hardware and software, said software of said camera control system being configured to control said camera to capture first, second and third images of a target scene during said first, second and third occasions, respectively, and then transmit first, second and third sets of imaging data representing, respectively, said first, second and third images. Clause 22. The system according to clause 21, further comprising an imaging data processing system capable of processing imaging data, said imaging data processing system comprising hardware and software, said software of said imaging data processing system being configured to register said first, second and third sets of imaging data relative to each other and to compute polarization values for the imaged scene based, in part, on stored data representing a characterization of a polarizing power of the camera. Clause 23.
A method for measuring polarization in light from a scene, comprising: (a) capturing successive images of a scene using a camera positioned close to a single position and oriented at successive different orientation angles, wherein a set of matrices characterizing a polarizing power of the camera at different angles of incidence and different orientation angles is known and there is no polarization filter between the camera's sensor array and the scene; (b) registering the captured images relative to each other; and (c) computing polarimetry values for light from at least one point of interest in the scene based on the registered captured images and a plurality of known matrices, wherein steps (b) and (c) are performed using a computer comprising hardware and software. Clause 24. The method according to clause 23, wherein the matrices are Mueller matrices and the computed polarimetry values are Stokes parameters. Clause 25. The method according to clause 23, wherein the computed polarimetry values comprise degree and angle of polarization. Clause 26. The method according to clause 23, wherein the scene is imaged at three different orientation angles about an optical axis of the camera, said different orientation angles being spaced at intervals of 45 degrees. Clause 27. The method according to clause 23, further comprising mounting the camera on a vehicle and maneuvering the vehicle to achieve the camera's different orientations. Clause 28.
An empirical method for characterizing the polarizing power of a camera with a lens and a focal plane array of sensors at a specified angle of incidence of the impinging light and a specified orientation angle, the method comprising: (a) providing a target that emits unpolarized light; (b) aiming the camera at the target without an intervening polarization filter and with a part of the target projected onto at least one sensor at a center of the focal plane array; (c) capturing a reference image while the camera is in the state described in step (b); (d) measuring a reference pixel value for a pixel in the reference image that corresponds to a sensor at the center of the focal plane array; (e) aiming the camera at the target without an intervening polarization filter and with a part of the target projected onto at least one sensor near an edge or corner of the focal plane array; (f) capturing a first image while the camera is in the state described in step (e); (g) measuring a first pixel value for a pixel in the first image that corresponds to a sensor near the edge or corner of the focal plane array; (h) placing a linear polarization filter between the camera and the target; (i) capturing a second image while the camera is in the state described in steps (e) and (h); (j) measuring a second pixel value for a pixel in the second image that corresponds to the sensor near the edge or corner of the focal plane array; (k) calculating a first element of a matrix based on at least the reference pixel value and the first pixel value; and (l) calculating a second element of the matrix based on at least the reference pixel value and the second pixel value. Clause 29. The empirical method according to clause 28, wherein step (h) further comprises orienting the linear polarization filter with its polarization axis parallel to one of a surface plane at the center of the camera lens or a plane of incidence at the center of the camera lens. Clause 30.
The empirical method according to clause 28, further comprising: (m) rotating the linear polarization filter by 90°; (n) capturing a third image while the camera is in the state described in steps (e) and (m); and (o) measuring a third pixel value for a pixel in the third image that corresponds to the sensor near the edge or corner of the focal plane array, wherein, in step (l), the second element of the matrix is calculated based on at least the reference pixel value and the second and third pixel values. Clause 31. The empirical method according to clause 30, further comprising computing an intensity coefficient based on the reference pixel value and the second and third pixel values. Clause 32. The empirical method according to clause 31, wherein, in step (l), the calculation of the second element of the matrix is additionally based on the intensity coefficient.
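A minimal sketch of the element calculations in clauses 28 to 32 might look like the following. The specific normalizations are assumptions for illustration, since the clauses do not fix the exact formulas here:

```python
def first_mueller_elements(p_ref, p_unpol, p_par, p_perp):
    """Rough estimate of the first two elements of the camera's Mueller
    matrix at one off-axis sensor position, in the spirit of clauses 28-32.

    p_ref   -- reference pixel value at the center of the focal plane array
    p_unpol -- pixel value near the edge/corner with no filter (step (g))
    p_par, p_perp -- pixel values there with the polarizer at 0 and at
                     90 degrees (steps (j) and (o)).
    The normalizations below are illustrative assumptions."""
    m00 = p_unpol / p_ref           # transmission relative to the center
    k = (p_par + p_perp) / p_ref    # intensity coefficient (clause 31)
    m01 = (p_par - p_perp) / p_ref  # linear diattenuation term (clause 30)
    return m00, m01, k
```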
[00181] Although several embodiments have been described thus far in terms of aircraft, in other embodiments the platform can comprise: (a) a spacecraft that reorients between passes over a target; or (b) boats or submerged vehicles that take underwater photographs. Embodiments that use a gimbal need not even be on a vehicle: gimbal-mounted cameras on ground vehicles or in fixed installations can use gimbal movement to orient the camera and the attached filter. This can even be applied to a handheld camera, such as a smart phone, with a polarization filter attached to the front of the lens. Since many smart phones include accelerometers or other devices for sensing orientation, and have processors and communication links, a smart phone with a polarization filter can be as capable as a camera-equipped airplane of acquiring polarized images and using them to produce polarimetry measurements.
[00182] Furthermore, although the foregoing embodiments refer to a CCD, the teachings described here can also be used with other electronic focal plane technologies or with a film camera and a scanning digitizer.
[00183] Although systems for acquiring polarimetric data have been described with reference to various embodiments, it will be understood by those skilled in the art that various changes can be made and equivalents may be substituted for elements thereof without departing from the scope of the claims presented below. In addition, many modifications can be made to adapt the teachings set forth herein to a particular situation without departing from the scope of the claims.
[00184] As used in the claims, the term "computer system" should be interpreted broadly to encompass a system having at least one computer or processor, which may have multiple computers or processors that communicate over a network or bus. As used in the preceding sentence, both the terms "computer" and "processor" refer to devices with a processing unit (for example, a central processing unit) and some form of memory (that is, computer-readable media) for storing a program that is readable by the processing unit.
[00185] The method claims presented below should not be construed as requiring that the steps recited therein be performed in alphabetical order or in the order in which they are recited. Nor should they be construed to exclude two or more steps, or any portions thereof, from being performed concurrently or alternately.
[00186] As used in this description, the term "location" includes both position and orientation.
Claims (11)
1. Method for determining a polarization of a scene, characterized by the fact that it comprises: (a) mounting a camera (16) comprising a lens (28) and an array (26) of sensors on a vehicle, preferably an unmanned vehicle, and placing a linear polarization filter (18) in a field of view of the camera (16); (b) maneuvering the vehicle so that it successively locates the camera (16) and the linear polarization filter (18) in proximity to a single position, but in three different orientations, for each of which a scene is in the field of view of the camera (16); (c) capturing first through third filtered images while the camera (16) and the linear polarization filter (18) are in the three different orientations, respectively; (d) transferring first through third sets of imaging data representing, respectively, the first through third filtered images from the camera (16) to a computer system; and (e) computing a polarization of at least one point in the scene from the first through third sets of imaging data.
2. Method according to claim 1, characterized by the fact that step (a) additionally comprises characterizing the polarizing power of the camera (16).
3. Method according to claim 2, characterized by the fact that step (a) additionally comprises determining first and second elements of the Mueller matrix.
4. Method according to any one of claims 1 to 3, characterized in that respective angles about a line of sight of the camera (16) relative to a reference for at least two of the three different orientations differ by an odd integral multiple of 45°.
5. Method according to any one of claims 1 to 4, characterized by the fact that step (e) comprises computing Stokes parameters.
6. Method according to any one of claims 1 to 5, characterized in that it additionally comprises registering the first through third sets of imaging data relative to each other before performing step (e).
7. System for acquiring images of a scene, characterized by the fact that it comprises: an unmanned vehicle; a camera (16) on board the unmanned vehicle, the camera (16) comprising a lens (28) and a sensor array (26); a first linear polarization filter (18) arranged in front of at least a first part of the sensor array (26); an unmanned vehicle control system capable of controlling the unmanned vehicle to perform maneuvers, the unmanned vehicle control system comprising hardware and computer-readable instructions, the computer-readable instructions of the unmanned vehicle control system being configured to control the unmanned vehicle to position itself at or near a specified position on each of first, second and third occasions and in first, second and third orientations that are different from each other but that each place the scene in a field of view of the camera (16); and a camera control system arranged on board the unmanned vehicle and capable of controlling the camera (16) to capture images, the camera control system comprising hardware and computer-readable instructions, the computer-readable instructions of the camera control system being configured to control the camera (16) to capture first, second and third images of a target scene (12) during the first, second and third occasions, respectively, and then transmit first, second and third sets of imaging data representing, respectively, the first, second and third images; and an imaging data processing system capable of processing imaging data, the imaging data processing system comprising hardware and computer-readable instructions, the computer-readable instructions of the imaging data processing system being configured to register the first, second and third sets of imaging data relative to each other and compute polarization values for the imaged scene.
8. System according to claim 7, characterized by the fact that the computer-readable instructions of the imaging data processing system are configured to compute polarization values for the imaged scene based, in part, on stored data representing a characterization of a polarizing power of the camera (16).
9. System according to either of claims 7 or 8, characterized in that the polarization values comprise Stokes parameters.
10. System according to any one of claims 7 to 9, characterized in that respective angles about a line of sight of the camera (16) relative to a reference for at least two of the first through third orientations differ by an integral multiple of 45°.
11. System according to any one of claims 7 to 10, characterized in that it additionally comprises a gimbal (32) mounted on the unmanned vehicle, wherein the camera (16) is rotatably coupled to the gimbal (32), in particular for rotation about an axis that is parallel to an optical axis of the camera (16), and wherein, optionally, the linear polarization filter (18) is attached to the camera (16).
Patent family:
Publication number | Publication date
KR20150099416A|2015-08-31|
BR102015001708A2|2015-09-22|
EP2905590A1|2015-08-12|
CN104833424A|2015-08-12|
JP6516487B2|2019-05-22|
US20150219498A1|2015-08-06|
CA2870718C|2017-03-21|
EP2905590B1|2016-10-05|
US9464938B2|2016-10-11|
JP2015155896A|2015-08-27|
CA2870718A1|2015-08-06|
KR101784334B1|2017-10-11|
CN104833424B|2017-10-24|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

CN1451230A|2000-07-21|2003-10-22|纽约市哥伦比亚大学托管会|Method and apparatus for image mosaicing|
US6678046B2|2001-08-28|2004-01-13|Therma-Wave, Inc.|Detector configurations for optical metrology|
US7085622B2|2002-04-19|2006-08-01|Applied Material, Inc.|Vision system|
IL149934A|2002-05-30|2007-05-15|Rafael Advanced Defense Sys|Airborne reconnaissance system|
US7193214B1|2005-04-08|2007-03-20|The United States Of America As Represented By The Secretary Of The Army|Sensor having differential polarization and a network comprised of several such sensors|
US20070244608A1|2006-04-13|2007-10-18|Honeywell International Inc.|Ground control station for UAV|
BRPI0920950A2|2008-11-20|2015-12-29|Bae Systems Plc|aircraft and method of controlling the direction the flight direction of an aircraft|
JP2011186328A|2010-03-10|2011-09-22|Hitachi Maxell Ltd|Polarization diffraction grating array, polarization sensor, and polarization analysis device|
US8757900B2|2012-08-28|2014-06-24|Chapman/Leonard Studio Equipment, Inc.|Body-mounted camera crane|
US10215642B2|2012-05-17|2019-02-26|The University of Akron|System and method for polarimetric wavelet fractal detection and imaging|
WO2018165027A1|2017-03-06|2018-09-13|Polaris Sensor Technologies, Inc.|Polarization-based detection and mapping method and system|
US10395113B2|2014-01-22|2019-08-27|Polaris Sensor Technologies, Inc.|Polarization-based detection and mapping method and system|
WO2016033181A1|2014-08-26|2016-03-03|Digital Wind Systems, Inc.|Method and apparatus for contrast enhanced photography of wind turbine blades|
US9508263B1|2015-10-20|2016-11-29|Skycatch, Inc.|Generating a mission plan for capturing aerial images with an unmanned aerial vehicle|
US10008123B2|2015-10-20|2018-06-26|Skycatch, Inc.|Generating a mission plan for capturing aerial images with an unmanned aerial vehicle|
US10094928B2|2016-02-19|2018-10-09|The Govemment of the United States ofAmerica, as represented by the Secretary of the Navy|Turbulence ocean lidar|
US9959772B2|2016-06-10|2018-05-01|ETAK Systems, LLC|Flying lane management systems and methods for unmanned aerial vehicles|
US10789853B2|2016-06-10|2020-09-29|ETAK Systems, LLC|Drone collision avoidance via air traffic control over wireless networks|
CN108318458B|2017-01-16|2020-10-09|北京航空航天大学|Method for measuring outdoor typical feature pBRDFsuitable for different weather conditions|
CN110132420B|2018-02-09|2020-11-27|上海微电子装备(集团)股份有限公司|Polarization measuring device, polarization measuring method, and optical alignment method|
EP3578126A1|2018-06-08|2019-12-11|Stryker European Holdings I, LLC|Surgical navigation system|
CN109002796B|2018-07-16|2020-08-04|阿里巴巴集团控股有限公司|Image acquisition method, device and system and electronic equipment|
US10819082B2|2018-07-26|2020-10-27|The Government Of The United States Of America, As Represented By The Secretary Of The Navy|Multifrequency ocean lidar power optimizer|
WO2021015669A1|2019-07-19|2021-01-28|National University Of Singapore|Method for aligning an autonomous mobile apparatus to a reference object, an autonomous mobile apparatus, and a guidance module thereof|
Legal status:
2015-09-22| B03A| Publication of an application: publication of a patent application or of a certificate of addition of invention|
2018-10-30| B06F| Objections, documents and/or translations needed after an examination request according art. 34 industrial property law|
2020-04-22| B06U| Preliminary requirement: requests with searches performed by other patent offices: suspension of the patent application procedure|
2020-09-08| B09A| Decision: intention to grant|
2020-11-24| B16A| Patent or certificate of addition of invention granted|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 26/01/2015, SUBJECT TO THE LEGAL CONDITIONS. |
Priority:
Application number | Filing date | Patent title
US14/174,652|2014-02-06|
US14/174,652|US9464938B2|2014-02-06|2014-02-06|Systems and methods for measuring polarization of light in images|